Patent Abstract:
METHOD AND SYSTEM FOR HAND CONTROL OF A MINIMALLY INVASIVE TELEOPERATED SLAVE SURGICAL INSTRUMENT. The present invention relates to a minimally invasive surgical system in which a hand tracking system tracks a location of a sensor element mounted on part of a human hand. A system control parameter is generated based on the location of the part of the human hand. Operation of the minimally invasive surgical system is controlled using the system control parameter. Thus, the minimally invasive surgical system includes a hand tracking system. The hand tracking system tracks a location of part of a human hand. A controller coupled to the hand tracking system converts the location into a system control parameter, and injects a command into the minimally invasive surgical system based on the system control parameter.
Publication number: BR112012011321B1
Application number: R112012011321-6
Filing date: 2010-11-11
Publication date: 2020-10-13
Inventors: Brandon D. Itkowitz; Simon Dimaio; Tao Zhao; Karlin Y. Bark
Applicant: Intuitive Surgical Operations, Inc.
IPC main class:
Patent description:

Cross-Reference to Related Applications
[0001] This application is a continuation-in-part of US patent application number 12/617,937 (filed on November 13, 2009, disclosing "Patient-Side Surgeon Interface For a Minimally Invasive Teleoperated Surgical Instrument"), which is incorporated herein by reference in its entirety.
Background of the Invention
Field of the Invention
[0002] Aspects of this invention relate to the control of minimally invasive surgical systems and, more particularly, to the use of a surgeon's hand motion to control a minimally invasive surgical system.
Related Art
[0003] Methods and techniques for tracking hand positions and gestures are known. For example, some game controllers use hand tracking input. For example, the Nintendo Wii® gaming platform supports remote controllers with wireless position and orientation sensing. (Wii is a registered trademark of Nintendo of America Inc., Redmond, Washington, U.S.A.). The use of gestures and other physical movements, such as swinging a bat or waving a magic wand, provides the fundamental gaming element for this platform. The Sony PlayStation Move has features similar to those of the Nintendo Wii® gaming platform.
[0004] A wireless CyberGlove® motion capture data glove, available from CyberGlove Systems, includes eighteen data sensors, with two flexion sensors on each finger, four abduction sensors, and sensors that measure thumb crossover, palm arch, wrist flexion, and wrist abduction. (CyberGlove® is a registered trademark of CyberGlove Systems LLC of San Jose, CA.) When a three-dimensional tracking system is used with the CyberGlove® motion capture data glove, x, y, z position and yaw, pitch, roll orientation information for the hand is available. The motion capture system for the CyberGlove® motion capture data glove has been used in digital prototype evaluation, virtual reality biomechanics, and animation.
[0005] Another data glove, with forty sensors, is the ShapeHand data glove available from Measurand Inc. The ShapeClaw portable, lightweight hand motion capture system, also available from Measurand Inc., includes a system of flexible ribbons that captures motion of the index finger and thumb along with the position and orientation of the hand and forearm in space.
[0006] In In-Cheol Kim and Sung-Il Chien, "Analysis of 3D Hand Trajectory Gestures Using Stroke-Based Composite Hidden Markov Models", Applied Intelligence, Vol. 15, No. 2, pp. 131-143, September-October 2001, Kim and Chien explore the use of three-dimensional trajectory input with a Polhemus sensor for gesture recognition. Kim and Chien propose this form of input because three-dimensional trajectories offer greater discriminating power than two-dimensional gestures, which are predominantly used in video-based approaches. For their experiments, Kim and Chien used a Polhemus magnetic position tracking sensor attached to the back of a Fakespace PinchGlove. The PinchGlove provides a means for the user to signal the beginning and end of a gesture, while the Polhemus sensor captures the three-dimensional trajectory of the user's hand.
[0007] In Elena Sanchez-Nielsen, et al., "Hand Gesture Recognition for Human-Machine Interaction", Journal of WSCG, Vol. 12, No. 1-3, ISSN 1213-6972, WSCG'2004, February 2-6, 2004, Plzen, Czech Republic, a real-time vision system was proposed for use in visual interaction environments through hand gesture recognition, using general-purpose hardware and low-cost sensors, such as a personal computer and a webcam. In Pragati Garg, et al., "Vision Based Hand Gesture Recognition", 49 World Academy of Science, Engineering and Technology, 972-977 (2009), a review of vision-based hand gesture recognition was presented. One conclusion noted that most approaches rely on several underlying assumptions that may be satisfied in a controlled laboratory setting but do not generalize to arbitrary settings. The authors stated that "Computer Vision methods for hand gesture interfaces must surpass current performance in terms of robustness and speed to achieve interactivity and usability". In the medical field, gesture recognition has been considered for sterile browsing of radiology images. See Juan P. Wachs, et al., "A Gesture-based Tool for Sterile Browsing of Radiology Images", Journal of the American Medical Informatics Association (2008; 15:321-323, DOI 10.1197/jamia.M24).
Summary
[0008] In one aspect, a hand tracking system in a minimally invasive surgical system tracks a location of part of a human hand. A system control parameter for the minimally invasive surgical system is generated based on the location of the part of the human hand. Operation of the minimally invasive surgical system is controlled using the system control parameter.
[0009] In one aspect, sensor elements mounted on part of a human hand are tracked to obtain locations of the part of the human hand. A position and an orientation for a control point are generated based on the locations. Teleoperation of a device in a minimally invasive surgical system is controlled based on the position and orientation of the control point. In one aspect, the device is a teleoperated slave surgical instrument. In another aspect, the device is a virtual proxy presented in a video image of a surgical site. Examples of a virtual proxy include a virtual slave surgical instrument, a virtual hand, and a virtual telestration device.
[00010] In a further aspect, a grip closure parameter is generated in addition to the position and orientation of the control point. A grip of an end effector of the teleoperated slave surgical instrument is controlled based on the grip closure parameter.
[00011] In another aspect, the system control parameter is the position and orientation of a control point used in teleoperation of the slave surgical instrument. In yet another aspect, the system control parameter is determined from two hands: a position and an orientation of a control point for one of the two hands, and a position and an orientation of a control point for the other of the two hands. The control points are used in teleoperating an endoscopic camera manipulator in the minimally invasive surgical system.
[00012] In yet another aspect, sensor elements mounted on part of a second human hand are tracked in addition to the sensor elements on the part of the first human hand. A position and an orientation for a second control point are generated based on the location of the part of the second human hand. In this aspect, both the control point and the second control point are used in the teleoperation control.
[00013] In still another aspect, sensor elements mounted on digits of a human hand are tracked. A motion between the digits is determined, and an orientation of a teleoperated slave surgical instrument in a minimally invasive surgical system is controlled based on the motion.
[00014] When the motion is a first motion, the control includes rolling a tip of a wrist of the slave surgical instrument about its pointing direction. When the motion is a second motion different from the first motion, the control includes yawing the wrist of the slave surgical instrument.
[00015] A minimally invasive surgical system includes a hand tracking system and a controller coupled to the hand tracking system. The hand tracking system tracks locations of a plurality of sensor elements mounted on part of a human hand. The controller transforms the locations into a position and an orientation of a control point. The controller sends a command to move a device in the minimally invasive surgical system based on the control point. Again, in one aspect, the device is a teleoperated slave surgical instrument, while in another aspect the device is a virtual proxy presented in a video image of a surgical site.
[00016] In one aspect, the system also includes a master finger tracking device that includes the plurality of tracking sensors. The master finger tracking device further includes a compressible body, a first finger loop attached to the compressible body, and a second finger loop attached to the compressible body. A first tracking sensor in the plurality of tracking sensors is attached to the first finger loop. A second tracking sensor in the plurality of tracking sensors is attached to the second finger loop.
[00017] Hence, in one aspect, a minimally invasive surgical system includes a master finger tracking device. The master finger tracking device includes a compressible body, a first finger loop attached to the compressible body, and a second finger loop attached to the compressible body. A first tracking sensor is attached to the first finger loop. A second tracking sensor is attached to the second finger loop.
[00018] The compressible body includes a first end, a second end and an outer surface. The outer surface includes a first portion that extends between the first and second ends, and a second portion, opposite and removed from the first portion, that extends between the first and second ends.
[00019] The compressible body also has a length. The length is selected to limit a separation between a first digit and a second digit of the human hand.
[00020] The first finger loop is attached to the compressible body adjacent to the first end and extends around the first portion of the outer surface. Upon placing the first finger loop on a first digit of a human hand, a first part of the first portion of the outer surface comes into contact with the first digit.
[00021] The second finger loop is attached to the compressible body adjacent to the second end and extends around the first portion of the outer surface. Upon placing the second finger loop on a second digit of the human hand, a second part of the first portion of the outer surface comes into contact with the second digit. As the first and second digits are moved toward each other, the compressible body is positioned between the two digits, so that the compressible body provides resistance to the motion.
[00022] A thickness of the compressible body is selected so that when a tip of the first digit just touches a tip of the second digit, the compressible body is less than fully compressed. The compressible body is configured to provide haptic feedback corresponding to a grip force of an end effector of a teleoperated slave surgical instrument.
[00023] In one aspect, the first and second tracking sensors are passive electromagnetic sensors. In an additional aspect, each passive electromagnetic tracking sensor has six degrees of freedom.
[00024] A method of using the master finger tracking device includes tracking a first location of a sensor mounted on a first digit of a human hand and a second location of another sensor mounted on a second digit. Each location has N degrees of freedom, where N is an integer greater than zero. The first location and the second location are mapped into a control point location. The control point location has six degrees of freedom, and the six degrees of freedom are less than or equal to the 2*N degrees of freedom. The first location and the second location are also mapped into a parameter having a single degree of freedom. Teleoperation of a slave surgical instrument in a minimally invasive surgical system is controlled based on the control point location and the parameter.
[00025] In a first aspect, the parameter is a grip closure distance. In a second aspect, the parameter comprises an orientation. In another aspect, N is six, while in a different aspect N is five.
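The following is a minimal Python sketch of the mapping described in paragraphs [00024] and [00025]. The midpoint rule for the control point position and the use of fingertip separation as the single-degree-of-freedom parameter are assumptions for illustration; the function name and NumPy usage are not from the patent.

```python
import numpy as np

def map_locations(p_thumb, p_index):
    """Map two tracked digit locations to a control point location and a
    single-degree-of-freedom parameter.

    p_thumb, p_index : (3,) positions of the thumb- and index-mounted sensors.
    The orientation component of the control point would be derived from the
    sensors' orientations (see the Detailed Description); only the position
    and the grip closure distance are computed here.
    """
    control_point = 0.5 * (p_thumb + p_index)                  # midway between digits
    grip_distance = float(np.linalg.norm(p_index - p_thumb))   # single DOF
    return control_point, grip_distance
```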
[00026] In still a further aspect, sensor elements mounted on part of a human hand are tracked to obtain a plurality of locations of the part of the human hand. A hand gesture from a plurality of known hand gestures is selected based on the plurality of locations. Operation of a minimally invasive surgical system is controlled based on the hand gesture.
[00027] The hand gesture can be any one of a hand gesture pose, a hand gesture trajectory, or a combination of a hand gesture pose and a hand gesture trajectory. When the hand gesture is a hand gesture pose and the plurality of known hand gestures includes a plurality of hand gesture poses, a user interface of the minimally invasive surgical system is controlled based on the hand gesture pose.
[00028] Furthermore, in one aspect, when the hand gesture is a hand gesture pose, the hand gesture selection includes generating an observed feature set from the plurality of tracked locations. The observed feature set is compared with feature sets of the plurality of known hand gesture poses. One of the known hand gesture poses is selected as the hand gesture pose. The selected known hand gesture pose is mapped to a system command, and the system command is triggered in the minimally invasive surgical system.
[00029] In still a further aspect, when the hand gesture includes a hand gesture trajectory, the user interface of the minimally invasive surgical system is controlled based on the hand gesture trajectory.
[00030] In the minimally invasive surgical system with the hand tracking system and the controller, the controller transforms the tracked locations into a hand gesture. The controller sends a command to modify an operating mode of the minimally invasive surgical system based on the hand gesture.
[00031] In yet another aspect, a sensor element mounted on part of a human hand is tracked to obtain a location of the part of the human hand. Based on the location, it is determined whether a position of the part of the human hand is within a threshold distance of a position of a master tool grip in a minimally invasive surgical system. Operation of the minimally invasive surgical system is controlled based on a result of the determination. In one aspect, teleoperation of a teleoperated slave surgical instrument coupled to the master tool grip is controlled based on the result of the determination. In another aspect, display of a user interface, or display of a visual proxy, is controlled based on the result of the determination.
[00032] In one aspect, the position of the part of the human hand is specified by a control point position. In another aspect, the position of the part of the human hand is an index finger position.
[00033] A minimally invasive surgical system includes a hand tracking system. The hand tracking system tracks a location of part of a human hand. A controller uses the location to determine whether a surgeon's hand is close enough to a master tool grip to permit a particular operation of the minimally invasive surgical system.
[00034] The minimally invasive surgical system also includes a controller coupled to the hand tracking system. The controller converts the location into a system control parameter, and injects a command based on the system control parameter into the minimally invasive surgical system.
Brief Description of the Drawings
[00035] Figure 1 is a high-level diagrammatic view of a minimally invasive teleoperated surgical system that includes a hand tracking system.
[00036] Figures 2A to 2G are examples of various configurations of a hand-worn master finger tracking grip used to control a teleoperated slave surgical instrument of the minimally invasive teleoperated surgical system of figure 1.
[00037] Figures 3A to 3D are examples of hand gesture poses used to control system modes in the minimally invasive teleoperated surgical system of figure 1.
[00038] Figures 4A to 4C are examples of hand gesture trajectories that are also used to control system modes in the minimally invasive teleoperated surgical system of figure 1.
[00039] Figure 5 is an illustration of the placement of fiducial markers for hand tracking in a camera-based tracking system.
[00040] Figures 6A and 6B are more detailed diagrams of the surgeon's console of figure 1, and include examples of coordinate systems used in hand tracking by the minimally invasive teleoperated surgical system of figure 1.
[00041] Figure 7 is a more detailed illustration of a hand-worn master finger tracking grip and of the tracked locations and coordinate systems used in hand tracking by the minimally invasive teleoperated surgical system of figure 1.
[00042] Figure 8 is a process flow diagram of a process used in the hand tracking system to track the digits of a hand and to generate data for teleoperation of a slave surgical instrument in the minimally invasive teleoperated surgical system of figure 1.
[00043] Figure 9 is a more detailed process flow diagram of the MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER process of figure 8.
[00044] Figure 10 is a process flow diagram of a process used in the hand tracking system to recognize hand gesture poses and hand gesture trajectories.
[00045] Figure 11 is a process flow diagram of a process used in the hand tracking system to detect the presence of a hand.
[00046] Figure 12 is an illustration of an example of a master finger tracking device.
[00047] Figure 13 is an illustration of a video image, presented on a display device, which includes a visual proxy that, in this example, includes a virtual phantom instrument, and a teleoperated slave surgical instrument.
[00048] Figure 14 is an illustration of a video image, presented on a display device, which includes visual proxies that, in this example, include a pair of virtual hands, and teleoperated slave surgical instruments.
[00049] Figure 15 is an illustration of a video image, presented on a display device, which includes visual proxies that, in this example, include a virtual telestration device and a virtual phantom instrument, and teleoperated slave surgical instruments.
[00050] In the drawings, the first digit of a three-digit reference number indicates the figure in which the element with that reference number first appeared, and the first two digits of a four-digit reference number indicate the figure in which the element with that reference number first appeared.
Detailed Description
[00051] As used in this document, a location includes a position and an orientation.
[00052] As used in this document, a hand gesture, sometimes called a gesture, includes a hand gesture pose, a hand gesture trajectory, and a combination of a hand gesture pose and a hand gesture trajectory.
[00053] Aspects of this invention augment the control capability of minimally invasive surgical systems, for example, the da Vinci® minimally invasive teleoperated surgical system marketed by Intuitive Surgical, Inc. of Sunnyvale, California, by using hand location information in the control of the minimally invasive surgical system. A measured location of one or more digits of the hand is used to determine a system control parameter that, in turn, is used to trigger a system command in the surgical system. The system commands depend on the location of the person whose hand location is being tracked, that is, whether or not the person is at a surgeon's console.
[00054] When the measured tracked locations are for digits of a hand of a person who is not at a surgeon's console, the system commands include a command to change the orientation of part of a teleoperated slave surgical instrument based on a combination of the hand orientation and the relative motion of two digits of the hand, and a command to move a tip of the teleoperated slave surgical instrument so that the tip motion follows the motion of part of the hand. When the measured tracked locations are for digits of a hand of a person at a surgeon's console, the system commands include commands that permit or inhibit motion of a slave surgical instrument from continuing to follow the motion of a master tool grip. When the measured tracked locations are for digits of a hand of a person who is not at a surgeon's console, or for digits of a hand of a person at a surgeon's console, the system commands include commanding the system, or a part of the system, to take an action based on a hand gesture pose, and commanding the system, or a part of the system, to take an action based on a hand gesture trajectory.
[00055] Figure 1 is a high-level diagrammatic view of a minimally invasive teleoperated surgical system 100, for example, the da Vinci® Surgical System, which includes a hand tracking system. There are other parts, cables, etc., associated with the da Vinci® Surgical System, but these are not illustrated in figure 1 to avoid detracting from the description. Additional information regarding minimally invasive surgical systems can be found, for example, in US patent application number 11/762,165 (filed on June 13, 2007, disclosing "Minimally Invasive Surgical System") and US patent number 6,331,181 (issued December 18, 2001, disclosing "Surgical Robotic Tools, Data Architecture, And Use"), both of which are incorporated herein by reference. See also, for example, US patent numbers 7,155,315 (filed December 12, 2005; disclosing "Camera Referenced Control In A Minimally Invasive Surgical Apparatus") and 7,574,250 (filed February 4, 2003; disclosing "Image Shifting Apparatus And Method For A Telerobotic System"), both of which are incorporated herein by reference.
[00056] In this example, system 100 includes a cart 110 with a plurality of manipulators.
[00057] Each manipulator, and the teleoperated slave surgical instrument controlled by that manipulator, can be coupled to and decoupled from the master tool manipulators on surgeon's console 185 and, in addition, can be coupled to and decoupled from the non-actuated, mechanically ungrounded master finger tracking grip 170, sometimes called master finger tracking grip 170.
[00058] A stereoscopic endoscope 112 mounted on manipulator 113 provides an image of surgical site 103 within patient 111 that is displayed on display 187 and on the display of surgeon's console 185. The image includes images of any of the slave surgical devices in the field of view of stereoscopic endoscope 112. The interactions between the master tool manipulators on surgeon's console 185, the slave surgical devices, and stereoscopic endoscope 112 are the same as in a known system and so are known to those skilled in the field.
[00059] In one aspect, surgeon 181 moves at least one digit of the surgeon's hand, which in turn causes a sensor in master finger tracking grip 170 to change location. Hand tracking transmitter 175 provides a field so that the new position and orientation of the digit is sensed by master finger tracking grip 170. The newly sensed position and orientation are provided to hand tracking controller 130.
[00060] In one aspect, as explained more completely below, hand tracking controller 130 maps the sensed position and orientation to a control point position and a control point orientation in an eye coordinate system of surgeon 181. Hand tracking controller 130 sends this location information to system controller 140, which in turn sends a system command to the teleoperated slave surgical instrument coupled to master finger tracking grip 170. As explained more completely below, using master finger tracking grip 170, surgeon 181 can control, for example, the grip of an end effector of the teleoperated slave surgical instrument, as well as the roll and yaw of a wrist coupled to the end effector.
[00061] In another aspect, hand tracking of at least part of the hand of surgeon 181 or of the hand of surgeon 180 is used by hand tracking controller 130 to determine whether a hand gesture pose is made by the surgeon, or whether a combination of a hand gesture pose and a hand gesture trajectory is made by the surgeon. Each hand gesture pose and each trajectory combined with a hand gesture pose is mapped to a different system command. The system commands control, for example, changes in system mode and other aspects of minimally invasive surgical system 100.
[00062] For example, instead of using foot pedals and switches as in a known minimally invasive surgical system, a hand gesture, either a hand gesture pose or a hand gesture trajectory, is used (i) to initiate following between motion of the master tool grip and the associated teleoperated slave surgical instrument, (ii) for master clutch activation (which decouples master control from the slave instrument), (iii) for endoscopic camera control (which allows the master to control endoscope motion or features, such as focus or electronic zoom), (iv) for robotic arm swap (which swaps a particular master control between two slave instruments), and (v) for TILEPRO™ swap (which toggles the display of auxiliary video windows on the surgeon's display). (TILEPRO is a registered trademark of Intuitive Surgical, Inc. of Sunnyvale, CA, USA.)
[00063] When there are only two master tool grips in system 100 and surgeon 180 wishes to control motion of a slave surgical instrument other than the two teleoperated slave surgical instruments coupled to the two master tool grips, the surgeon may lock one or both of the teleoperated slave surgical instruments in place using a first hand gesture. The surgeon then associates one or both of the master tool grips with other slave surgical instruments held by other manipulator arms using a different hand gesture, which in this implementation provides an association swap between a master tool grip and another teleoperated slave surgical instrument. Surgeon 181 performs an equivalent procedure when there are only two master finger tracking grips in system 100.
[00064] In yet another aspect, a hand tracking unit 186 mounted on surgeon's console 185 tracks at least part of the hand of surgeon 180 and sends the sensed location information to hand tracking controller 130. Hand tracking controller 130 determines when the surgeon's hand is close enough to the master tool grip to permit system following, for example, motion of the slave surgical instrument following motion of the master tool grip. As explained more completely below, in one aspect hand tracking controller 130 determines the position of the surgeon's hand and the position of the corresponding master tool grip. If the difference between the two positions is within a predetermined distance, for example, less than a threshold separation, following is permitted; otherwise, following is inhibited. In this way, the distance is used as a measure of the presence of the surgeon's hand relative to the master tool grip on surgeon's console 185. In another aspect, when the position of the surgeon's hand relative to the position of the master tool grip is less than the threshold separation, display of a user interface on a display device is inhibited, i.e., the user interface is turned off. Conversely, when the position of the surgeon's hand relative to the position of the master tool grip is greater than the threshold separation, the user interface is displayed on the display device, i.e., turned on.
[00065] Detecting the presence of the surgeon's hand has been a long-standing problem.
[00066] Presence detection has been attempted many times using different contact sensing technologies, such as capacitive switches, pressure sensors, and mechanical switches. However, these approaches are inherently problematic because surgeons have different preferences for how and where they hold the master tool grip. Using distance as a presence measure is advantageous because this type of presence detection allows the surgeon to touch the master tool grip lightly and then momentarily break physical contact to adjust the master tool grip, yet it does not constrain how the surgeon holds the master tool grip with the fingers.
Surgical Instrument Control through Hand Tracking
[00067] An example of a non-actuated, mechanically ungrounded master finger tracking grip 270, sometimes called master finger tracking grip 270, is illustrated in figures 2A to 2D in different configurations that are described more completely below. Master finger tracking grip 270 includes digit-mounted sensors 211, 212, sometimes referred to as index-finger- and thumb-mounted sensors 211, 212, which independently track the location (position and orientation, in one example) of a tip of an index finger 292B and of a tip of a thumb 292A, that is, which track the locations of two digits of the surgeon's hand. In this way, the location of the hand itself is tracked, as opposed to tracking the location of master tool grips as in a known minimally invasive surgical system.
[00068] In one aspect, the sensors provide tracking of six degrees of freedom (three in translation and three in rotation) for each digit of the hand on which a sensor is mounted. In another aspect, the sensors provide tracking of five degrees of freedom (three in translation and two in rotation) for each digit of the hand on which a sensor is mounted.
[00069] In yet another aspect, the sensors provide tracking of three degrees of freedom (three in translation) for each digit of the hand on which a sensor is mounted. When two digits are each tracked with three degrees of freedom, the total of six translational degrees of freedom is sufficient to control a slave surgical instrument that does not include a wrist mechanism.
[00070] A padded foam connector 210 is connected between the finger- and thumb-mounted sensors 211, 212. Connector 210 constrains thumb 292A and index finger 292B, that is, the digits of hand 291R, to remain within a fixed distance of each other; i.e., there is a maximum separation distance between the digits of hand 291R on which master finger tracking grip 270 is mounted. As thumb 292A and index finger 292B are moved from the maximum separation (figure 2A) to a fully closed configuration (figure 2D), the padding provides positive feedback to help surgeon 181 control the grip force of an end effector of a teleoperated slave surgical instrument coupled to master finger tracking grip 170.
[00071] For the position illustrated in figure 2A, with thumb 292A and index finger 292B separated by the maximum distance permitted by master finger tracking grip 270, the grip force is a minimum. Conversely, in the position illustrated in figure 2D, where thumb 292A and index finger 292B are as close together as connector 210 permits, that is, separated by the minimum distance permitted by master finger tracking grip 270, the grip force is a maximum. Figures 2B and 2C represent positions that are mapped to intermediate grip forces.
[00072] As explained more completely below, the tracked locations (positions and orientations) of thumb 292A and index finger 292B in figures 2A to 2D are mapped to a grip closure parameter, for example, a normalized grip closure value, that is used to control the grip of the teleoperated slave surgical instrument coupled to master finger tracking grip 270. Specifically, the sensed tracked locations of thumb 292A and index finger 292B are mapped to the grip closure parameter by hand tracking controller 130.
[00073] In this way, a location of part of the hand of surgeon 181 is tracked. Based on the tracked location, a system control parameter of minimally invasive surgical system 100, that is, a grip closure parameter, is generated by hand tracking controller 130 and supplied to system controller 140. System controller 140 uses the grip closure parameter in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated slave surgical instrument to configure an end effector to have a grip closure corresponding to the grip closure parameter. Thus, minimally invasive surgical system 100 uses the grip closure parameter to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.
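A minimal sketch of one possible mapping from fingertip separation to the grip closure parameter. The constants and the linear form are assumptions; the patent requires only that separation map monotonically to grip closure.

```python
D_MAX = 0.115  # m; assumed maximum separation permitted by connector 210
D_MIN = 0.015  # m; assumed separation with the foam fully compressed

def grip_closure(separation_m: float) -> float:
    """Return a normalized grip closure value in [0, 1]:
    0.0 = fully open (minimum grip force), 1.0 = fully closed (maximum)."""
    s = min(max(separation_m, D_MIN), D_MAX)  # clamp to the physical range
    return (D_MAX - s) / (D_MAX - D_MIN)
```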
[00074] Also, the tracked locations (position and orientation) of thumb 292A and index finger 292B in figures 2A to 2D are mapped to a control point position and a control point orientation by hand tracking controller 130. The control point position and control point orientation are mapped into an eye coordinate system of surgeon 181 and then provided to system controller 140 via a command signal. The control point position and control point orientation in the eye coordinate system are used by system controller 140 for teleoperation of the slave surgical instrument coupled to master finger tracking grip 170.
[00075] Again, a location of part of the hand of surgeon 181 is tracked. Based on the tracked location, another system control parameter of minimally invasive surgical system 100, that is, the control point position and orientation, is generated by hand tracking controller 130. Hand tracking controller 130 transmits a command signal with the control point position and orientation to system controller 140. System controller 140 uses the control point position and orientation in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated slave surgical instrument to position itself based on the control point position and orientation. Thus, minimally invasive surgical system 100 uses the control point position and orientation to control operation of the teleoperated slave surgical instrument of the minimally invasive surgical system.
[00076] In addition to determining the grip closure based on the positions of sensors 211, 212, other relative motion between index finger 292B and thumb 292A is used to control the yaw motion and the roll motion of the slave surgical instrument. Rubbing index finger 292B and thumb 292A crosswise against each other, as if spinning a spindle, which is represented by the arrows (figure 2E) about an imaginary axis 293, produces roll of the tip of the slave surgical instrument, while sliding the index finger and the thumb lengthwise back and forth along each other, which is represented by the arrows (figure 2F) along an axis in the pointing direction represented by arrow 295, produces yaw motion of the slave surgical instrument about the X axis. This is achieved by mapping the vector between the index fingertip and thumb tip positions to define the X axis of the control point orientation. The control point position remains relatively stationary, since the finger and thumb slide symmetrically along axis 295. Although the finger and thumb motions are not perfectly symmetric, the position still remains sufficiently stationary that the user can easily correct any disturbance that may occur.
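A sketch of the axis mapping just described: the thumb-tip-to-index-fingertip vector defines the X axis of the control point orientation, so crosswise rubbing rotates the frame about X (roll) and lengthwise sliding re-points X (yaw). Completing the frame by projecting a world "up" vector is an assumption for illustration.

```python
import numpy as np

def control_point_axes(p_thumb, p_index, up=np.array([0.0, 0.0, 1.0])):
    """Return a 3x3 rotation matrix whose columns are the control point axes."""
    x = p_index - p_thumb
    x = x / np.linalg.norm(x)   # X axis: the digit-to-digit pointing direction
    z = np.cross(x, up)         # any reference not parallel to x would do
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)          # completes a right-handed frame
    return np.column_stack((x, y, z))
```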
[00077] Again, locations of part of the hand of surgeon 181 are tracked. Based on the tracked locations, yet another system control parameter, that is, the relative motion between two digits of hand 291R of the surgeon, is generated by hand tracking controller 130.
[00078] Hand tracking controller 130 converts the relative motion into an orientation for the teleoperated slave surgical instrument coupled to master finger tracking grip 170. Hand tracking controller 130 sends a command signal with the orientation to system controller 140. Although this orientation is an absolute orientation mapping, system controller 140, in one aspect, uses this input during teleoperation with ratcheting, in the same way as an orientation input from any other master tool grip with a passive gimbal. An example of ratcheting is described in commonly assigned US patent application number 12/495,213 (filed on June 30, 2009, disclosing "Ratcheting For Master Alignment Of A Teleoperated Surgical Instrument"), which is incorporated herein by reference in its entirety.
[00079] System controller 140 uses the orientation in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated slave surgical instrument to rotate based on the orientation. Thus, minimally invasive surgical system 100 uses the motion between the two digits to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.
[00080] When the motion is a first motion, for example, crosswise rubbing of index finger 292B and thumb 292A as if spinning a spindle, the orientation is a roll, and the system command results in a roll of the tip of the slave surgical instrument's wrist about its pointing direction. When the motion is a second motion different from the first motion, for example, lengthwise sliding of the index finger and thumb back and forth along each other (figure 2F), the orientation is a yaw, and the system command results in a yaw motion of the wrist of the slave surgical instrument.
[00081] In yet another aspect, when the surgeon changes the operating mode of the system to a gesture recognition mode, both hands are tracked, and control point positions and orientations for both hands are generated, in one aspect, based on the positions and orientations sensed by the hand-mounted sensors. For example, as shown in figure 2G, the tips of the thumb and index finger of each hand touch each other to form a circular shape. The sensed position of each hand is mapped by hand tracking controller 130 to a pair of control point positions. The control point pair is sent with a camera control system event to system controller 140.
[00082] Thus, in this aspect, a location of part of each hand of surgeon 181 is tracked.
[00083] Another system control parameter of minimally invasive surgical system 100, that is, the pair of control point positions, is generated by hand tracking controller 130 based on the tracked locations. Hand tracking controller 130 sends the pair of control point positions with a camera control system event to system controller 140.
[00084] In response to the camera control system event, system controller 140 generates a camera control system command based on the pair of control point positions. The camera control system command is sent to a teleoperated endoscopic camera manipulator in minimally invasive surgical system 100. Thus, minimally invasive surgical system 100 uses the pair of control point positions to control operation of the teleoperated endoscopic camera manipulator of minimally invasive surgical system 100.
System Control through Hand Gesture Poses and Hand Gesture Trajectories
[00085] In this aspect, after being placed in a gesture detection operating mode, hand tracking controller 130 detects a hand gesture pose, or a hand gesture pose and a hand gesture trajectory. Controller 130 maps hand gesture poses to certain system mode control commands and, similarly, maps hand gesture trajectories to other system mode control commands. Note that the mapping of poses and trajectories is independent, and so this differs, for example, from tracking sign language. The ability to generate system commands and to control system 100 using hand gesture poses and hand gesture trajectories, instead of manipulating switches, numerous foot pedals, etc., as in known minimally invasive surgical systems, provides greater ease of use of system 100 for the surgeon.
[00086] When a surgeon is standing, the use of hand gesture poses and hand gesture trajectories to control system 100 makes it unnecessary for the surgeon to take his or her eyes off the patient and/or the display and search for a foot pedal or switch when the surgeon wants to change the system mode. Finally, eliminating the various switches and foot pedals reduces the floor space required by the minimally invasive teleoperated surgical system.
[00087] The particular set of hand gesture poses and hand gesture trajectories used in controlling minimally invasive surgical system 100 is not critical, as long as each hand gesture pose and each hand gesture trajectory is unambiguous. Specifically, a hand gesture pose should not be interpretable by hand tracking controller 130 as one or more other hand gesture poses in the set of poses, and a hand gesture trajectory should not be interpretable as more than one hand gesture trajectory in the set of trajectories. Thus, the hand gesture poses and hand gesture trajectories discussed below are illustrative only and are not intended to be limiting.
[00088] Figures 3A to 3D are examples of hand gesture poses 300A to 300D, respectively.
[00089] Figures 4A to 4C are examples of hand gesture trajectories. Note, for example, that the configuration in figure 2A looks similar to that in figure 3A; however, the operating mode of minimally invasive surgical system 100 is different when the two configurations are used.
[00090] In figure 2A, the minimally invasive teleoperated slave surgical instrument is coupled to master finger tracking grip 170, and system 100 is in following mode, so that motion of the minimally invasive teleoperated slave surgical instrument follows the tracked motion of the surgeon's hand. In figures 3A to 3D and 4A to 4C, the surgeon places system 100 in gesture recognition mode and then makes one of the illustrated hand gesture poses or hand gesture trajectories. Hand gesture poses and hand gesture trajectories are used to control system modes and are not used in the following mode of operation. For example, system modes controlled with hand gesture poses serve to enable, disable, and cycle between visual displays, to clutch a visual display, and to draw and erase telestration.
[00091] In hand gesture pose 300A (figure 3A), thumb 292A and index finger 292B are separated beyond a master clutch threshold, for example, the spacing between the two digits of hand 291R is greater than 115 mm. Hand gesture pose 300B (figure 3B), with index finger 292B extended and thumb 292A curled, is used to signal hand tracking controller 130 that the surgeon is tracing a hand gesture trajectory (see figures 4A and 4B). Hand gesture pose 300C (figure 3C), with thumb 292A up and index finger 292B curled, is used to turn on a user interface and to cycle between modes in the user interface. Hand gesture pose 300D (figure 3D), with thumb 292A down and index finger 292B curled, is used to turn off the user interface. Other hand gesture poses could include an "OK" hand gesture pose, an L-shaped hand gesture pose, etc.
[00092] Hand tracking controller 130, in one aspect, uses a multidimensional feature vector to recognize and identify a hand gesture pose. Initially, a plurality of hand gesture poses is specified. Next, a feature set that includes a plurality of features is specified. The feature set is designed to uniquely identify each hand gesture pose in the plurality of poses.
[00093] A hand gesture pose recognition process is trained using a training database. The training database includes a plurality of instances of each of the hand gesture poses. The plurality of instances includes feature vectors for poses made by a number of different people. A feature set is generated for each of the instances in the training database. These feature sets are used to train a multidimensional Bayesian classifier, as explained more completely below.
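A minimal training sketch under the stated approach: a per-pose mean and covariance are estimated from the training feature sets, and the largest Mahalanobis distance seen in training is kept as the acceptance threshold used in paragraph [00096]. The data layout and names are assumptions for illustration.

```python
import numpy as np

def train_pose_classifier(training_sets):
    """training_sets: dict pose_name -> (M, d) array, one feature vector per
    training instance of that hand gesture pose."""
    stats = {}
    for pose, F in training_sets.items():
        mu = F.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(F, rowvar=False))
        # Squared Mahalanobis distance of every training instance to the mean.
        d2 = np.einsum('ij,jk,ik->i', F - mu, cov_inv, F - mu)
        stats[pose] = (mu, cov_inv, float(np.sqrt(d2.max())))
    return stats
```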
[00094] When surgeon 180 wants to enter the gesture operating mode, the surgeon activates a switch, for example, depresses a foot pedal, and then makes a hand gesture pose with at least one hand. Note that although this example requires a single foot pedal, it permits elimination of the other foot pedals in the foot tray of the known minimally invasive surgical system and so still has the advantages described above. Hand tracking unit 186 sends signals representing the sensed positions and orientations of the thumb and index finger of the surgeon's hand or hands to hand tracking controller 130.
[00095] Using the tracking data for the digits of the surgeon's hand, hand tracking controller 130 generates an observed feature set. Hand tracking controller 130 then uses the trained multidimensional Bayesian classifier and a Mahalanobis distance to determine the likelihood, that is, the probability, that the observed feature set is the feature set of a hand gesture pose in the plurality of poses. This is done for each of the hand gesture poses in the plurality of poses.
[00096] The hand gesture pose in the plurality of poses that is selected by hand tracking controller 130 as the observed hand gesture pose is the one with the smallest Mahalanobis distance, provided that distance is less than the maximum Mahalanobis distance in the training database for that hand gesture pose. The selected hand gesture pose is mapped to a system event. Hand tracking controller 130 injects the system event into system controller 140.
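A sketch of this selection rule, reusing the statistics from the training sketch above: the pose with the smallest Mahalanobis distance wins, but only if that distance does not exceed the largest distance seen for that pose in training.

```python
import numpy as np

def classify_pose(f_obs, stats):
    """f_obs: observed feature vector; stats: output of train_pose_classifier."""
    best_pose, best_d = None, float('inf')
    for pose, (mu, cov_inv, d_max) in stats.items():
        diff = f_obs - mu
        d = float(np.sqrt(diff @ cov_inv @ diff))  # Mahalanobis distance
        if d < best_d:
            best_pose, best_d = pose, d
    # Reject if farther than the worst training example for the winning pose.
    if best_pose is not None and best_d <= stats[best_pose][2]:
        return best_pose
    return None  # no hand gesture pose recognized
```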
[00097] System controller 140 processes the system event and issues a system command.
[00098] For example, if hand gesture pose 300C (figure 3C) is detected, system controller 140 sends a system command to display controller 150 to turn on the user interface. In response, display controller 150 executes at least part of user interface module 155 on processor 151 to generate a user interface on the display of surgeon's console 185.
[00099] Thus, in this aspect, minimally invasive surgical system 100 tracks a location of part of a human hand. Based on the tracked location, a system control parameter is generated; for example, a hand gesture pose is selected. The hand gesture pose is used to control the user interface of minimally invasive surgical system 100, for example, to display the user interface on the display of surgeon's console 185.
[000100] The user interface control is illustrative only and is not intended to be limiting. A hand gesture can be used to make any of the mode changes in a known minimally invasive surgical system, for example, master clutch, camera control, camera focus, manipulator arm swap, etc.
[000101] If the hand gesture pose recognition process determines that the observed hand gesture pose is the hand gesture pose for a hand gesture trajectory, a system event is not injected by hand tracking controller 130 based on the pose recognition. Instead, a hand gesture trajectory recognition process is initiated.
[000102] In this example, hand gesture pose 300B (figure 3B) is the pose used to make a hand gesture trajectory. Figures 4A and 4B are two-dimensional examples of hand gesture trajectories 400A and 400B that are made using hand gesture pose 300B. Figure 4C presents other two-dimensional examples of hand gesture trajectories that can be used.
[000103] In one aspect, the hand gesture trajectory recognition process uses a Hidden Markov Model Λ. To generate the probability distributions for Hidden Markov Model Λ, a training database is required. Before obtaining the training database, a set of hand gesture trajectories is specified. In one aspect, the sixteen hand gesture trajectories of figure 4C are selected.
[000104] In one aspect, a number of test subjects were selected to make each of the hand gesture trajectories. In one example, each test subject made each trajectory a predetermined number of times. The position and orientation data for each subject for each trajectory made were saved in the training database. In one aspect, as explained more completely below, the training database was used to train a discrete left-right Hidden Markov Model using an iterative Baum-Welch method.
[000105] When a surgeon traces a trajectory, the data are converted into an observation sequence O by hand tracking controller 130. With observation sequence O and Hidden Markov Model Λ, hand tracking controller 130 determines which hand gesture trajectory corresponds to the observed symbol sequence. In one aspect, hand tracking controller 130 uses the forward recursion algorithm with Hidden Markov Model Λ to generate the total probability of the observed symbol sequence. The hand gesture trajectory with the highest probability is selected if that probability is greater than a threshold probability. If the highest probability is less than the threshold probability, no hand gesture trajectory is selected, and processing ends.
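A minimal sketch of forward recursion for a discrete HMM, scoring an observation sequence against each trained trajectory model. Unscaled probabilities are used for brevity; a real implementation would scale or work in log space for long sequences. The model layout is an assumption.

```python
import numpy as np

def forward_log_prob(obs, pi, A, B):
    """Log of the total probability of symbol sequence `obs` under an HMM with
    initial distribution pi (S,), transitions A (S, S), emissions B (S, K)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # forward recursion step
    return float(np.log(alpha.sum()))

def recognize_trajectory(obs, models, log_threshold):
    """models: dict name -> (pi, A, B). Returns the best-scoring gesture
    trajectory if its probability exceeds the threshold, else None."""
    scores = {name: forward_log_prob(obs, *m) for name, m in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > log_threshold else None
```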
[000106] The selected hand gesture trajectory is mapped to a system event. Hand tracking controller 130 injects the system event into system controller 140.
[000107] System controller 140 processes the system event and issues a system command.
[000108] For example, if the selected hand gesture trajectory is mapped to an event to change the illumination level at the surgical site, system controller 140 sends a system command to a controller in an illuminator to change the illumination level.
Presence Detection through Hand Tracking
[000109] In one aspect, as indicated above, the positions of the surgeon's hands 291R, 291L (figure 6A) are tracked to determine whether teleoperation of minimally invasive surgical system 100 is permitted and, in some aspects, whether a user interface should be displayed to the surgeon. Again, hand tracking controller 130 tracks at least part of a hand of surgeon 180B (figure 6A). Hand tracking controller 130 generates a location for a master tool grip, for example, master tool grip 621 (figure 6B), which is representative of master tool grips 621L, 621R (figure 6A), and a location for the hand. Hand tracking controller 130 maps the two tracked locations into a common coordinate frame and then determines the distance between the two tracked locations in the common coordinate frame. The distance is a system control parameter for the minimally invasive surgical system that is based on the tracked location of the surgeon's hand.
[000110] If the distance is less than a safety threshold, that is, less than the maximum permitted separation between the hand part and the master tool grip, teleoperation of minimally invasive surgical system 100 is permitted; otherwise, teleoperation is inhibited. Similarly, in the aspect that uses presence detection to control the display of a user interface, if the distance is less than the safety threshold, that is, less than the maximum permitted separation between the hand part and the master tool grip, display of a user interface on a display of minimally invasive surgical system 100 is inhibited; otherwise, display of the user interface is permitted.
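A sketch of the distance test: both tracked positions are mapped into a common frame and compared against a safety threshold. The transforms and the threshold value are assumptions for illustration.

```python
import numpy as np

THRESHOLD_M = 0.10  # assumed maximum permitted hand-to-grip separation

def teleoperation_permitted(p_hand, T_hand, p_grip, T_grip):
    """p_*: (3,) positions in their own tracker/kinematic frames;
    T_*: 4x4 homogeneous transforms into the common (world) frame."""
    hand_w = (T_hand @ np.append(p_hand, 1.0))[:3]
    grip_w = (T_grip @ np.append(p_grip, 1.0))[:3]
    return bool(np.linalg.norm(hand_w - grip_w) < THRESHOLD_M)
```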
[000111] In this way, the distance is used to control teleoperation of minimally invasive surgical system 100. Specifically, hand tracking controller 130 sends a system event to system controller 140 indicating whether teleoperation is permitted. In response to the system event, system controller 140 configures system 100 to permit or inhibit teleoperation.
Hand Location Tracking Technologies
[000112] Before considering the various aspects of hand tracking in further detail, an example of a tracking technology is described. This example is illustrative only and, in view of the description below, any tracking technology that provides the necessary hand or finger location information can be used.
[000113] In one aspect, pulsed DC electromagnetic tracking is used with sensors mounted on two digits of a hand, for example, on the thumb and index finger, as shown in figures 2A to 2D and figure 7. Each sensor measures six degrees of freedom and, in one aspect, is eight millimeters by two millimeters by one and a half millimeters (8 mm x 2 mm x 1.5 mm). The tracking system has a dexterous hemispherical workspace of 0.8 m and a position sensing resolution of 0.5 mm and 0.1 degrees. The update rate is 160 Hertz, and the sensing latency is four milliseconds. When integrated into a system, additional latency can be incurred due to additional filtering and communication. Effective command latency of up to 30 milliseconds has been found to be acceptable.
[000114] In this aspect, a tracking system includes an electromagnetic hand tracking controller, sensors for use in the master finger tracking grip, and a hand tracking transmitter. A tracking system suitable for use in one embodiment of this invention is available from Ascension Technology Corporation of Burlington, Vermont, USA as the 3D Guidance trakSTAR™ System with a Medium Range Transmitter. (trakSTAR™ is a registered trademark of Ascension Technology Corporation.) The transmitter generates pulsed DC magnetic fields for high-accuracy tracking over medium ranges, specified as 78 centimeters (31 inches). This system provides dynamic tracking with 240 to 420 updates per second for each sensor. The outputs of the miniaturized passive sensors are not affected by power-line noise sources. A clear line of sight between the transmitter and the sensors is not required. All-attitude tracking is provided, with no inertial drift or optical interference. There is high metal immunity and no distortion from nonmagnetic metals.
[000115] Although an electromagnetic tracking system with finger sleeves is described in this document, it is illustrative only and is not intended to be limiting. For example, a pen-like device could be held by the surgeon. The pen-like device carries three or more non-collinear fiducial markers on its external surface. Typically, to make at least three fiducial markers visible from any viewpoint, more fiducial markers are used, because of self-occlusion. The fiducial markers are sufficient to determine motion in six degrees of freedom (three in translation and three in rotation) of the device and thus of the hand holding the pen-like device. The pen-like device also senses grip, in one aspect.
[000116] The pen-like device is viewed by two or more cameras with known parameters to locate the fiducial markers in three dimensions and to infer the three-dimensional position of the device. The fiducial markers can be implemented, for example, as 1) retroreflective spheres with illumination close to the camera; 2) concave or convex half-spheres with illumination close to the camera; or 3) active markers, such as a (blinking) LED. In one aspect, near-infrared illumination of the device is used, and filters are used to block the visible spectrum at the camera to minimize distraction from background clutter.
[000117] In another aspect, a data glove 501 (figure 5) or a bare hand 502 is used, and fiducial markers 511 are attached to the thumb and index finger of glove 501 (and/or to other digits of the glove) that the surgeon will wear, and/or directly to the skin of hand 502. Again, redundant markers can be used to accommodate self-occlusion. Fiducial markers can also be placed on other fingers to enable more user interface features through specifically defined hand gestures.
[000118] The three-dimensional tracked locations of the fiducial markers are computed by triangulation from multiple cameras that have a common field of view. The three-dimensional tracked locations of the fiducial markers are used to infer the three-dimensional pose (translation and orientation) of the hand and also the grip size.
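A minimal linear (DLT) triangulation sketch for one fiducial marker seen by two calibrated cameras; the patent does not prescribe a particular triangulation method.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """P1, P2: 3x4 camera projection matrices; x1, x2: (2,) pixel
    coordinates of the same marker in each camera. Returns the 3D point."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)       # least-squares null vector of A
    X = Vt[-1]
    return X[:3] / X[3]               # dehomogenize
```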
[000119] The tracked marker locations need to be calibrated before use. For example, the surgeon can show the hand with markers in different positions to the camera. The different positions are then used for calibration.
[000120] In yet another aspect, markerless hand tracking is used. The articulated hand motion can be tracked using images viewed from one or more cameras and processing those images with executing computer software. The executing computer software does not need to track every degree of freedom of the hand to be useful. The executing software only needs to track the parts related to two fingers of a hand to be useful for controlling a surgical tool, as demonstrated in this document.
[000121] In camera-based tracking, the accuracy of the measurements depends on the accuracy of locating the markers in the image; the precision of three-dimensional reconstruction given the camera geometry; and redundant data, such as more than the minimum number (for example, three) of fiducial markers, more than the minimum number (one or two) of cameras, and averaging and temporal filtering.
[000122] The precision of three-dimensional reconstruction depends significantly on the accuracy of the camera calibration. Some fiducial markers attached to known tracked locations on the surgeon's console can be used to determine the extrinsic parameters (rotation and translation) of multiple cameras relative to the surgeon's console. This process can be carried out automatically. Active fiducial markers can be used as the calibration fiducial markers, since such markers are switched on only during a calibration process performed before the procedure. During the procedure, the calibration fiducial markers are turned off to avoid confusion with the fiducial markers used to locate the surgeon's hand. The relative extrinsic parameters can also be computed by observing a moving marker in the common field of view of the cameras.
[000123] Other tracking technologies that are suitable for use include, but are not limited to, inertial tracking, depth camera tracking and fiber bending capture.
[000124] As used in this document, a sensor element, sometimes called a tracking sensor, can be a sensor for any of the hand tracking technologies described above, for example, a passive electromagnetic sensor, a fiducial marker or a sensor for any of the technologies. Coordinate Systems
[000125] Before considering the various processes described above in further detail, an example of a surgeon console 185B (figure 6A) is considered, and several coordinate systems are defined for use in the following examples. Surgeon console 185B is an example of surgeon console 185. Surgeon console 185B includes a three-dimensional viewer 610, sometimes referred to as viewer 610, main tool manipulators 620L, 620R with main tool retainers 621L, 621R, and a base 630. Main tool retainer 621 (figure 6B) is representative of main tool retainers 621L, 621R.
[000126] Main tool retainers 621L, 621R of main tool manipulators 620L, 620R are held by surgeon 180B using the index finger and the thumb, so that targeting and grasping involve intuitive pointing and pinching motions. Main tool manipulators 620L, 620R in combination with main tool retainers 621L, 621R are used to control teleoperated auxiliary surgical instruments, teleoperated endoscopes, etc. in the same way as the main tool manipulators of a known minimally invasive teleoperated surgical system. Also, the position coordinates of main tool manipulators 620L, 620R and main tool retainers 621L, 621R are known from the kinematics used in the control of the auxiliary surgical instruments.
[000127] In the normal viewing mode of operation, viewer 610 displays three-dimensional images of surgical site 103 from stereoscopic endoscope 112. Viewer 610 is positioned on console 185B (figure 6B) near the surgeon's hands, so that the image of the surgical site seen in viewer 610 is oriented so that surgeon 180B feels that he is actually looking directly down at surgical site 103. The surgical instruments in the image appear to be located substantially where the surgeon's hands are located and oriented substantially as surgeon 180B would expect based on the position of his hands. However, surgeon 180B can see neither his hands nor the position or orientation of main tool retainers 621L, 621R while viewing the displayed image of the surgical site in viewer 610.
[000128] In one aspect, main tool manipulators 620L, 620R are moved from directly in front of surgeon 180B and from under viewer 610, so that they are positioned over base 630 and are no longer positioned under viewer 610; that is, the main tool manipulators are parked out of the way of the hand gestures. This provides an unobstructed volume under viewer 610 in which surgeon 180B can perform manual gestures, either or both of manual gesture positions or manual gesture trajectories.
[000129] In the aspect of figure 6A, three coordinate systems are defined in relation to surgeon console 185B: a display coordinate system 660, a universal coordinate system 670, and a tracker coordinate system 650. Note that equivalent coordinate systems are defined for surgeon 181 (figure 1), so that the mapping described more fully below can be performed on tracking data from either main finger tracking retainer 170 or main tool retainers 621L, 621R. See, for example, U.S. patent application number 12/617,937 (filed November 13, 2009, which describes "Patient-Side Surgeon Interface For a Minimally Invasive Teleoperated Surgical Instrument"), which was previously incorporated by reference.
[000130] In display coordinate system 660, surgeon 180B is looking down the Z-axis Zview. The Y-axis Yview points upward in the display. The X-axis Xview points to the left in the display. In universal coordinate system 670, the Z-axis Zworld is a vertical axis. The universal X-axis Xworld and the universal Y-axis Yworld lie in a plane perpendicular to Z-axis Zworld.
[000131] Figure 6B is a more detailed illustration of main tool retainer 621 and main tool manipulators 620. Coordinate systems 680, 690 are discussed more fully below in relation to method 1100 of figure 11. Surgical Instrument Control Process through Hand Tracking
[000132] Figure 7 is an illustration of sensor 212 mounted on index finger 292B with a location 713 in tracking coordinate system 750, and of sensor 211 mounted on thumb 292A with a location 711 in tracking coordinate system 750. Sensors 211 and 212 are part of the electromagnetic tracking system described above. Thumb 292A and index finger 292B are examples of digits of right hand 291R. As previously noted, a part of a human hand includes at least one digit of the hand. As known to those skilled in the field, the fingers, sometimes called digits or phalanges, of the hand are the thumb (first digit), the index finger (second digit; forefinger), the middle finger (third digit), the ring finger (fourth digit), and the little finger (fifth digit).
[000133] In this document, the thumb and index finger are used as examples of two digits of a human hand. This is only illustrative and is not intended to be limiting. For example, the thumb and the middle finger can be used in place of the thumb and index finger. The description in this document is also directly applicable to the use of the middle finger. Also, the use of the right hand is only illustrative. When similar sensors are used on the left hand, the description in this document is directly applicable to the left hand as well.
[000134] A cable 741, 742 connects sensors 211, 212 of main finger tracking retainer 270 to hand tracking controller 130. In one aspect, cable 741, 742 carries the position and orientation information from sensors 211, 212 to hand tracking controller 130.
[000135] The use of a cable to transmit the captured position and orientation data to the hand tracking controller 130 is illustrative only and is not intended to be limiting to this specific aspect. In view of this description, someone skilled in the field can select a mechanism to transmit the position and orientation data captured from the main finger tracking retainer or main finger tracking retainers to the hand tracking controller 130 (e.g., through the use of wireless connection).
[000136] Cable 741, 742 does not impede the motion of main finger tracking retainer 270. Since main finger tracking retainer 270 is mechanically ungrounded, each main finger tracking retainer is effectively unrestricted in position and orientation motion within the reachable workspace of the surgeon and the workspace of the hand tracking transmitter (for example, left-right, up-down, in-out, roll, pitch, and yaw in a Cartesian coordinate system).
[000137] In one aspect, as described above, each sensor 211, 212 in the main finger tracking retainer 270 captures three degrees of translation and three degrees of rotation, that is, six degrees of freedom. In this way, the data captured from the two sensors represents twelve degrees of freedom. In another aspect, each sensor 211, 212 in the main finger tracking retainer 270 captures three degrees of translation and two degrees of rotation (yaw and pitch), that is, five degrees of freedom. In this way, the data captured from the two sensors represents ten degrees of freedom.
[000138] The use of a control point position and control point orientation based on the tracked locations to control a teleoperated auxiliary surgical instrument requires six degrees of freedom (three of translation and three of orientation), as described more fully below. Thus, in the aspects where each sensor has five or six degrees of freedom, sensors 211, 212 provide redundant degrees of freedom. As described above and more fully below, the redundant degrees of freedom are mapped into parameters used to control aspects of the teleoperated auxiliary surgical instrument other than position and orientation.
[000139] In still a further aspect, each sensor 211, 212 senses only three translational degrees of freedom, so together the sensors represent six degrees of freedom. This is sufficient to control three degrees of translation, roll, and retainer closure for an auxiliary surgical instrument that does not include a wrist mechanism. The following description is used to generate the control point location using the six degrees of freedom. The control point orientation is taken as the orientation of the auxiliary surgical instrument. The retainer closure parameter is determined, as described below, using the control point location and control point orientation. The roll is determined, as described above, using the relative motion of the thumb and index finger.
[000140] In each aspect where the sensors sense six degrees of freedom, or where the sensors sense five degrees of freedom, index finger sensor 212 generates a signal representing an index finger position Pindex and an index finger orientation Rindex in tracking coordinate frame 750. Thumb sensor 211 generates a signal representing a thumb position Pthumb and a thumb orientation Rthumb in tracking coordinate frame 750. In one aspect, positions Pindex and Pthumb are taken as aligned with the center of the user's fingernail on index finger 292B and with the center of the user's thumbnail on thumb 292A, respectively.
[000141] In this example, positions Pindex and Pthumb are each represented as a three-by-one vector in tracking coordinate frame 750; positions Pindex and Pthumb are in tracker coordinates. Orientations Rindex and Rthumb are each represented as a three-by-three matrix in tracking coordinate frame 750, that is,

$$R_{index} = \begin{bmatrix} \hat{x}_{index} & \hat{y}_{index} & \hat{z}_{index} \end{bmatrix}, \qquad R_{thumb} = \begin{bmatrix} \hat{x}_{thumb} & \hat{y}_{thumb} & \hat{z}_{thumb} \end{bmatrix}$$
[000142] A control point position Pcp is centered between index finger 292B and thumb 292A. Control point position Pcp is defined in control point frame 790; however, it is specified in tracker coordinates. The Z-axis of control point frame 790 extends through control point position Pcp in the pointing direction, as described more fully below.
[000143] Also, as explained below, index finger 292B and thumb 292A are mapped to the claws of an auxiliary surgical instrument; however, the two digits are more dexterous than the instrument claws. The Y-axis of control point frame 790 corresponds to the pin used to close the instrument claws. Thus, the Y-axis of control point frame 790 is perpendicular to a vector between index finger 292B and thumb 292A, as described below.
[000144] Control point position Pcp is represented as a three-by-one vector in the tracker coordinates of tracking coordinate frame 750. Control point orientation Rcp is represented as a three-by-three matrix in the tracker coordinates, that is,

$$R_{cp} = \begin{bmatrix} \hat{x}_{cp} & \hat{y}_{cp} & \hat{z}_{cp} \end{bmatrix}$$
[000145] Figure 8 is a process flow diagram for mapping a location of part of a hand to a retainer closure parameter used to control the retainer of an auxiliary surgical instrument, for example, one of the teleoperated auxiliary surgical instruments in figure 1. This mapping also maps a temporal change in the location into a new retainer closure parameter and a corresponding location of the auxiliary instrument tip and the velocity of travel to that location.
[000146] Initially, upon entry to process 800, RECEIVE HAND LOCATION DATA process 810 receives the index finger position and orientation (Pindex, Rindex) and the thumb position and orientation (Pthumb, Rthumb), which in this example are stored as data 811. The index finger position and orientation and the thumb position and orientation are based on data from the tracking system. Process 810 transfers to MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER process 820.
[000147] MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER process 820 generates a control point position Pcp, a control point orientation Rcp, and a retainer closure parameter g_grip using the index finger position and orientation (Pindex, Rindex) and the thumb position and orientation (Pthumb, Rthumb). Control point position Pcp, control point orientation Rcp, and retainer closure parameter g_grip are stored as data 821.
[000148] In one aspect, the control point mapping performed in process 820 is defined to emulate the fundamental control point positioning properties of known main tool manipulators. Thus, the response to the motion of the thumb and index finger will be familiar and intuitive to users of the known minimally invasive teleoperated surgical system with a surgeon console similar to surgeon console 185B (figure 6A).
[000149] Figure 9 is a more detailed process flow diagram of one aspect of MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER process 820. First, in process 820, MAP HAND POSITION DATA TO CONTROL POINT process 910 generates a location of control point position Pcp from index finger position Pindex and thumb position Pthumb, that is,

$$P_{cp} = 0.5\,(P_{thumb} + P_{index})$$
[000150] Control point position Pcp is the average of index finger position Pindex and thumb position Pthumb. MAP HAND POSITION DATA TO CONTROL POINT process 910 transfers processing to GENERATE CONTROL POINT ORIENTATION process 920.
[000151] As indicated above, the Z-axis of the control point orientation is aligned with the pointing direction. In this aspect of GENERATE CONTROL POINT ORIENTATION process 920, the Rodrigues axis/angle formula is used to define the Z-axis pointing direction vector ẑhalf of the control point orientation as a half rotation between the index finger pointing direction vector ẑindex and the thumb pointing direction vector ẑthumb. From thumb orientation Rthumb, the thumb pointing direction vector ẑthumb is:

$$\hat{z}_{thumb} = R_{thumb}\begin{bmatrix}0\\0\\1\end{bmatrix}$$
[000152] Similarly, from index finger orientation Rindex, the index finger pointing direction vector ẑindex is:

$$\hat{z}_{index} = R_{index}\begin{bmatrix}0\\0\\1\end{bmatrix}$$
[000153] Vector ω is a vector perpendicular to index finger pointing direction vector ẑindex and thumb pointing direction vector ẑthumb. Vector ω is defined as the vector cross product of index finger pointing direction vector ẑindex and thumb pointing direction vector ẑthumb, that is,

$$\omega = \hat{z}_{index} \times \hat{z}_{thumb}$$
[000154] Angle θ is the magnitude of the angle between index finger pointing direction vector ẑindex and thumb pointing direction vector ẑthumb. Angle θ is defined as:

$$\theta = \operatorname{atan2}\left(\|\omega\|,\ \hat{z}_{index}\cdot\hat{z}_{thumb}\right)$$
[000155] With axis ω and angle θ, the pointing direction vector ẑhalf is obtained by rotating ẑindex about the unit axis ω̂ = ω/‖ω‖ by half of angle θ using the Rodrigues rotation formula:

$$\hat{z}_{half} = \hat{z}_{index}\cos\frac{\theta}{2} + \left(\hat{\omega}\times\hat{z}_{index}\right)\sin\frac{\theta}{2} + \hat{\omega}\left(\hat{\omega}\cdot\hat{z}_{index}\right)\left(1-\cos\frac{\theta}{2}\right)$$
[000156] Thus, process 910 has generated control point position Pcp, and the initial part of process 920 has generated the approximate pointing direction of the Z-axis of control point frame 790. One could proceed to interpolate the index finger and thumb orientation vectors to generate control point unit axis vectors x̂cp and ŷcp in a similar manner, and then re-orthogonalize them to produce a control point orientation matrix.
[000157] However, greater teleoperation dexterity can be obtained from the tracked locations of the fingers using the following mapping. This mapping uses the relative positions of the index finger and thumb to effectively roll and yaw the control point, as if a small gyroscope were being manipulated between the fingers. The remainder of process 920 is performed as follows to generate a complete set of orthonormal control point axis vectors x̂cp, ŷcp, and ẑcp:

$$\hat{x}' = \frac{P_{index} - P_{thumb}}{\left\|P_{index} - P_{thumb}\right\|}, \qquad \hat{y}_{cp} = \frac{\hat{z}_{half} \times \hat{x}'}{\left\|\hat{z}_{half} \times \hat{x}'\right\|}, \qquad \hat{z}_{cp} = \hat{z}_{half}, \qquad \hat{x}_{cp} = \hat{y}_{cp} \times \hat{z}_{cp}$$
[000158] With these vectors, control point orientation Rcp is:

$$R_{cp} = \begin{bmatrix} \hat{x}_{cp} & \hat{y}_{cp} & \hat{z}_{cp} \end{bmatrix}$$
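As a concrete illustration of processes 910 and 920, the sketch below computes the control point position and orientation from the two tracked poses using the formulas reconstructed above (midpoint position, Rodrigues half rotation, then orthonormalization). The names are hypothetical, and the code is a sketch, not the system's implementation.

```python
import numpy as np

def control_point_pose(p_index, R_index, p_thumb, R_thumb):
    """Map index/thumb positions and orientations to a control point pose.

    p_*: 3-vectors in tracker coordinates; R_*: 3x3 rotation matrices whose
    third column is the digit's pointing direction.
    """
    # Process 910: control point position is the midpoint of the two digits.
    p_cp = 0.5 * (p_thumb + p_index)

    # Process 920: Z axis is the half rotation between the pointing vectors.
    z_index = R_index[:, 2]
    z_thumb = R_thumb[:, 2]
    w = np.cross(z_index, z_thumb)                    # rotation axis (unnormalized)
    theta = np.arctan2(np.linalg.norm(w), z_index @ z_thumb)
    if np.linalg.norm(w) < 1e-9:                      # fingers pointing in parallel
        z_half = z_index
    else:
        k = w / np.linalg.norm(w)
        c, s = np.cos(theta / 2), np.sin(theta / 2)   # Rodrigues, half angle
        z_half = z_index * c + np.cross(k, z_index) * s + k * (k @ z_index) * (1 - c)

    # Orthonormal frame: Y perpendicular to the thumb-to-index vector and to Z.
    x_raw = p_index - p_thumb
    x_raw /= np.linalg.norm(x_raw)
    y_cp = np.cross(z_half, x_raw)
    y_cp /= np.linalg.norm(y_cp)
    x_cp = np.cross(y_cp, z_half)
    R_cp = np.column_stack([x_cp, y_cp, z_half])
    return p_cp, R_cp
```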
[000159] Now, with processes 910 and 920, process 820 has mapped the index finger and thumb positions and orientations (Pindex, Rindex), (Pthumb, Rthumb) to a control point position and orientation (Pcp, Rcp). Process 820 must also generate a retainer closure parameter g_grip. Thus, GENERATE CONTROL POINT ORIENTATION process 920 transfers processing to GENERATE GRIP CLOSURE PARAMETER process 930.
[000160] In process 930, the retainer closure is determined from the distances of index finger position Pindex and thumb position Pthumb to the axis defined by control point position Pcp and pointing direction ẑcp. This allows the retainer closure to be invariant to slipping when the thumb and index finger are touching.
[000161] Thus, index finger position Pindex and thumb position Pthumb are projected onto the Z-axis of frame 790. Position Pindex_proj is the projection of index finger position Pindex onto the Z-axis of frame 790, and position Pthumb_proj is the projection of thumb position Pthumb onto the Z-axis of frame 790:

$$P_{index\_proj} = P_{cp} + \left((P_{index} - P_{cp}) \cdot \hat{z}_{cp}\right)\hat{z}_{cp}, \qquad P_{thumb\_proj} = P_{cp} + \left((P_{thumb} - P_{cp}) \cdot \hat{z}_{cp}\right)\hat{z}_{cp}$$
[000162] Position Pindex_proj and position Pthumb_proj are used to evaluate a retainer closure evaluation distance d_val, that is,

$$d_{val} = \left\|P_{index} - P_{index\_proj}\right\| + \left\|P_{thumb} - P_{thumb\_proj}\right\|$$
[000163] Here, the double parallel bars are the known notation for the Euclidean norm. The evaluation retainer closure distance d_val is bounded by a maximum distance limit d_max and a minimum distance limit d_min. As illustrated in figure 7, a padded foam connector 210 between sensors 211, 212 constrains the fingers to stay within a fixed separation, that is, between a maximum distance limit d_max and a minimum distance limit d_min. Additionally, a neutral distance d_0 corresponds to the separation distance when the two fingers are just touching.
[000164] For a particular set of sensors and a connector, the maximum distance limit d_max, the minimum distance limit d_min, and the neutral distance d_0 are determined empirically. In one aspect, three different combinations of sensors and a connector are provided for small, medium, and large hands. Each combination has its own maximum distance limit d_max, minimum distance limit d_min, and neutral distance d_0, since the length of connector 210 is different in each combination.
[000165] Process 930 compares distance d_val with minimum distance limit d_min. If the comparison reveals that d_val is smaller than minimum distance limit d_min, retainer closure distance d is set to minimum distance limit d_min. Otherwise, process 930 compares distance d_val with maximum distance limit d_max. If the comparison reveals that distance d_val is greater than maximum distance limit d_max, retainer closure distance d is set to maximum distance limit d_max. Otherwise, retainer closure distance d is set to distance d_val.
[000166] The test performed on distance d_val to determine retainer closure distance d is summarized as:

$$d = \begin{cases} d_{min} & \text{if } d_{val} < d_{min} \\ d_{max} & \text{if } d_{val} > d_{max} \\ d_{val} & \text{otherwise} \end{cases}$$
[000167] Next, in process 930, retainer closure parameter g_grip is generated:

$$g_{grip} = \begin{cases} \dfrac{d - d_0}{d_{max} - d_0} & \text{if } d \ge d_0 \\[1.5ex] \dfrac{d - d_0}{d_0 - d_{min}} & \text{if } d < d_0 \end{cases}$$
[000168] Thus, a retainer closure distance d between maximum distance limit d_max and neutral distance d_0 is mapped to a value between zero and one. A retainer closure distance d between minimum distance limit d_min and neutral distance d_0 is mapped to a value between minus one and zero.
[000169] A value of one for retainer closure parameter g_grip is obtained when index finger 292B and thumb 292A are separated to the maximum extent allowed by connector 210 (figure 2A). A value of zero for retainer closure parameter g_grip is obtained when the tip of index finger 292B and the tip of thumb 292A are just touching (figure 2C). Values in the range between zero and one control the opening/closing of the actuator claws of an auxiliary surgical instrument. A value of minus one for retainer closure parameter g_grip is obtained when index finger 292B and thumb 292A are touching and connector 210 is completely compressed between index finger 292B and thumb 292A (figure 2D). Values in the range between zero and minus one control the grip force of the closed actuator claws. Connector 210 provides a passive haptic cue for claw closure.
[000170] This example of mapping retainer closure distance d to a value in one of the two ranges is only illustrative and is not intended to be limiting. The example illustrates mapping retainer closure distance d to a value in a first range of retainer closure parameter g_grip to control the opening/closing of the claws of an actuator of an auxiliary surgical instrument when retainer closure distance d is greater than neutral distance d_0. Here, "opening/closing" means opening and closing the claws. Retainer closure distance d is mapped to a value in a second range of retainer closure parameter g_grip to control the grip force of the closed actuator claws when retainer closure distance d is less than neutral distance d_0.
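The clamping and two-range mapping of process 930 follow directly from the formulas above. A minimal sketch, assuming the empirically determined limits d_min, d_0, and d_max for the connector in use:

```python
def grip_closure_parameter(d_val, d_min, d_0, d_max):
    """Map an evaluated closure distance to the grip parameter in [-1, 1].

    d_val: evaluated retainer closure distance.
    d_min, d_0, d_max: empirically determined minimum, neutral ("fingers just
    touching"), and maximum separation distances for the connector in use.
    """
    # Clamp the evaluated distance to the physically allowed range.
    d = min(max(d_val, d_min), d_max)
    if d >= d_0:
        # [d_0, d_max] -> [0, 1]: controls jaw opening/closing.
        return (d - d_0) / (d_max - d_0)
    # [d_min, d_0) -> (-1, 0): controls grip force of the closed jaws.
    return (d - d_0) / (d_0 - d_min)
```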
[000171] Thus, process 820 maps the index finger position and orientation (Pindex, Rindex) and the thumb position and orientation (Pthumb, Rthumb) to a control point position and orientation (Pcp, Rcp) and a retainer closure parameter g_grip, which are stored as data 821. Process 820 transfers to MAP TO WORLD COORDINATES process 830 (figure 8).
[000172] MAP TO WORLD COORDINATES process 830 receives data 821 and maps data 821 into a universal coordinate system (see universal coordinate system 670, figure 6A). Specifically, the control point position and orientation (Pcp, Rcp) are mapped to a universal-coordinate control point position and orientation (Pcp_wc, Rcp_wc) using a four-by-four homogeneous transform wcTtc that maps coordinates in tracking coordinate system 750 (figure 7) to coordinates in universal coordinate system 670, for example,

$${}^{wc}T_{tc} = \begin{bmatrix} {}^{wc}R_{tc} & {}^{wc}t_{tc} \\ 0 & 1 \end{bmatrix}$$

where wcRtc maps an orientation in tracking coordinates tc to an orientation in universal coordinates wc, and wcttc translates a position in tracking coordinates tc to a position in universal coordinates wc.
[000173] Retainer closure parameter g_grip is not changed by this mapping. The data in universal coordinates are stored as data 831. Process 830 transfers to MAP TO EYE COORDINATES process 840.
[000174] MAP TO EYE COORDINATES process 840 receives data 831 in universal coordinates wc and maps data 831 into an eye coordinate system (see eye coordinate system 660, figure 6A). Specifically, the universal-coordinate control point position and orientation (Pcp_wc, Rcp_wc) are mapped to an eye-coordinate control point position and orientation (Pcp_ec, Rcp_ec) using a four-by-four homogeneous transform ecTwc that maps coordinates in universal coordinate system 670 (figure 6A) to coordinates in eye coordinate system 660, for example,

$${}^{ec}T_{wc} = \begin{bmatrix} {}^{ec}R_{wc} & {}^{ec}t_{wc} \\ 0 & 1 \end{bmatrix}$$

where ecRwc maps an orientation in universal coordinates wc to an orientation in eye coordinates ec, and ectwc translates a position in universal coordinates wc to a position in eye coordinates ec.
[000175] Again, retainer closure parameter g_grip is not changed by the mapping. The data in eye coordinates are stored as data 841. Process 840 transfers to GENERATE VELOCITIES process 850.
[000176] In process 800, mapping processes 830 and 840 are described as two different processes only for ease of illustration. In one aspect, mapping processes 830 and 840 are combined so that the control point data in tracking coordinates tc are mapped directly to data in eye coordinates ec using a four-by-four homogeneous transform ecTtc that maps coordinates in tracking coordinate system 650 (figure 6A) to coordinates in eye coordinate system 660, for example,

$${}^{ec}T_{tc} = {}^{ec}T_{wc}\;{}^{wc}T_{tc}$$
[000177] In this aspect, the position of the control point in eye coordinates, Pcp_ec, is:

$$P_{cp\_ec} = {}^{ec}R_{tc}\,P_{cp} + {}^{ec}t_{tc}$$

and the orientation of the control point in eye coordinates, Rcp_ec, is:

$$R_{cp\_ec} = {}^{ec}R_{tc}\,R_{cp}$$
[000178] In some aspects, the mapping to universal coordinates can be eliminated. In this case, the control point data are mapped directly from the tracking coordinate system into the eye coordinate system without using a universal coordinate system.
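A minimal sketch of the combined mapping of processes 830 and 840, assuming the four-by-four homogeneous transforms are available from calibration; the function and variable names are illustrative:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def map_tracker_to_eye(p_cp, R_cp, T_wc_tc, T_ec_wc):
    """Map a control point pose from tracker to eye coordinates.

    T_wc_tc: tracker-to-world transform; T_ec_wc: world-to-eye transform,
    e.g., T_wc_tc = homogeneous(R_wc_tc, t_wc_tc) from calibration data.
    The grip closure parameter is unaffected by these rigid mappings.
    """
    T_ec_tc = T_ec_wc @ T_wc_tc                     # combined transform
    p_ec = T_ec_tc[:3, :3] @ p_cp + T_ec_tc[:3, 3]
    R_ec = T_ec_tc[:3, :3] @ R_cp
    return p_ec, R_ec
```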
[000179] For teleoperation, position, orientation, and velocity are necessary. Thus, GENERATE VELOCITIES process 850 generates the necessary velocities. The velocities can be generated in several ways. Some implementations, such as those with inertial and gyroscopic sensors, can directly measure differential signals to produce a linear velocity and an angular velocity of the control point. If the velocities cannot be measured directly, process 850, in one aspect, estimates the velocities from the location measurements in the eye coordinate system.
[000180] The velocities can be estimated using finite differences in the eye coordinate system over the sampling interval Δt. For example, the linear velocity v_cp_ec is estimated as:

$$v_{cp\_ec} = \frac{\Delta P_{cp\_ec}}{\Delta t}$$

and the angular velocity ω_cp_ec is estimated from the relative rotation over the interval,

$$\Delta R = R_{cp\_ec}(t)\,R_{cp\_ec}(t-\Delta t)^{\mathsf T},$$

as the axis-angle vector of ΔR divided by Δt.
[000181] In another aspect of GENERATE VELOCITIES process 850, the control point linear velocity v_cp_tc and the control point angular velocity ω_cp_tc are sensed directly in the tracking coordinates of tracking coordinate system 750 (figure 7). In this aspect, the directly sensed control point linear velocity v_cp_tc and the directly sensed control point angular velocity ω_cp_tc are rotated from tracking coordinate system 750 to eye coordinate system 660 using rotation ecRtc. Specifically, using the rotation mappings defined above:

$$v_{cp\_ec} = {}^{ec}R_{tc}\,v_{cp\_tc}, \qquad \omega_{cp\_ec} = {}^{ec}R_{tc}\,\omega_{cp\_tc}$$
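Where the velocities are not sensed directly, the finite-difference estimate of process 850 can be sketched as follows; the axis-angle conversion is delegated to SciPy, and the names are illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def estimate_velocities(p_prev, R_prev, p_curr, R_curr, dt):
    """Finite-difference linear and angular velocity of the control point.

    Poses are given in eye coordinates at two consecutive samples dt apart.
    """
    v = (p_curr - p_prev) / dt
    # Relative rotation over the interval, expressed as an axis-angle vector.
    dR = R_curr @ R_prev.T
    w = Rotation.from_matrix(dR).as_rotvec() / dt
    return v, w
```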
[000182] GENERATE VELOCITIES process 850 transfers to SEND CONTROL COMMAND process 860. Process 860 sends an appropriate system control command to the auxiliary surgical instrument based on the position, orientation, velocities, and retainer closure parameter stored as data 851.
[000183] In one aspect, processes 810 to 850 are performed by hand tracking controller 130 (figure 1). Controller 130 runs finger tracking module 135 on processor 131 to perform processes 810 to 850. In this respect, finger tracking module 135 is stored in memory 132.
[000184] Process 850 sends a system event to system controller 140 which in turn performs process 860.
[000185] It will be appreciated that hand tracking controller 130 and system controller 140 can be implemented in practice by any combination of hardware, software executing on a processor, and firmware. Also, the functions of these controllers, as described here, can be performed by one unit or divided among different components, each of which can in turn be implemented by any combination of hardware, software executing on a processor, and firmware. When divided among different components, the components can be centralized at one location or distributed throughout system 100 for distributed processing purposes. Manual Gesture Position and Gesture Trajectory Control Process
[000186] Figure 10 is a process flow diagram of an aspect of a manual gesture position and manual gesture trajectory control process 1000 of system 100. In one aspect, as described above, a manual gesture position recognition process 1050 uses a multidimensional Bayesian classifier, and a manual gesture trajectory recognition process 1060 uses a discrete Hidden Markov Model Λ.
[000187] As described above, figures 3A to 3D are examples of manual gesture positions. To train the 1050 hand gesture position recognition process, numerous hand gesture positions are specified. The number of hand gesture positions used is limited by the ability to define unique positions that can be unambiguously identified by the 1050 recognition process, and by the surgeon's ability to reliably remember and reproduce each different hand gesture position.
[000188] In addition to defining the manual gesture positions, a feature set that includes a plurality of features f_i, where i ranges from 1 to n, is defined. The number n is the number of features used. The number and type of features are selected so that each manual gesture in the set of allowable positions can be accurately identified. In one aspect, the number n is six.
[000189] The following description is an example of a feature set with n features:

$$\begin{aligned}
f_1 &= \hat{z}_{index} \cdot \hat{z}_{thumb} \\
f_2 &= \left\|P_{thumb} - P_{index}\right\| \\
f_3 &= (P_{thumb} - P_{index}) \cdot \hat{z}_{index} \\
f_4 &= \left\|(P_{thumb} - P_{index}) - f_3\,\hat{z}_{index}\right\| \\
f_5 &= \hat{z}_{thumb} \cdot \hat{z}_{world} \\
f_n &= \hat{x}_{thumb} \cdot \hat{z}_{index}
\end{aligned}$$
[000190] Feature f_1 is the scalar product of the pointing direction ẑindex of index finger 292B and the pointing direction ẑthumb of thumb 292A. Feature f_2 is the distance between index finger 292B and thumb 292A. Feature f_3 is the distance of thumb 292A projected onto the pointing direction ẑindex of index finger 292B. Feature f_4 is the distance of thumb 292A from the axis along the pointing direction ẑindex of index finger 292B. Feature f_5 is the Z component of the pointing direction ẑthumb of thumb 292A. Feature f_n is the scalar product of the thumb normal vector x̂thumb of thumb 292A and the pointing direction ẑindex of index finger 292B.
[000191] Before using method 1000, it is necessary to develop a manual gesture training database. A number of different users each produce at least one manual gesture, and the position and orientation data of each manual gesture for each user are measured using the tracking system. For example, each person in a group of people makes each allowable manual gesture. The positions and orientations of the index finger and thumb of each person in the group are saved in the training database.
[000192] Using the training database, a feature set is generated for each manual gesture of each user. The set of training feature vectors for each manual gesture is then used to compute a mean feature vector f̄_i and a covariance Σ_fi.
[000193] Thus, the training database is used to obtain a mean feature vector and a feature vector covariance for each trained gesture. In addition, for each manual gesture position, a Mahalanobis distance d(f_i) (see the discussion below) is generated for each trainer, and the maximum Mahalanobis distance d(f_i) for each manual gesture position is saved as a threshold for that manual gesture position.
[000194] One can also use the Mahalanobis distance measure to verify that all trained gestures are sufficiently different and unambiguous for the given feature set used. This can be done by testing the Mahalanobis distance between the mean feature vector of a given gesture and the mean feature vectors of all other allowable gestures. This test distance should be much greater than the maximum training distance used for that particular gesture.
[000195] As is known to those skilled in the art, the specification of a Hidden Markov Model requires the specification of two model parameters, N and M, and three probability measures A, B, and π. The Hidden Markov Model Λ is represented as: Λ = (A, B, π)
[000196] Model parameter N is the number of states in the model, and model parameter M is the number of observation symbols per state. The three probability measures consist of state transition probability distribution A, observation symbol probability distribution B, and initial state distribution π.
[000197] In one aspect, for a discrete Hidden Markov Model, transition probability distribution A is an N x N matrix, observation symbol probability distribution B is an N x M matrix, and initial state distribution π is an N x 1 matrix.
[000198] Given an observation sequence O and the Hidden Markov Model Λ, the probability of observation sequence O given Hidden Markov Model Λ, that is, P(O|Λ), is evaluated in process 1000, as described more fully below.
[000199] To generate the probability distributions for Hidden Markov Model Λ, a training database is required. Before obtaining the training database, a set of manual gesture trajectories is specified.
[000200] A number of test subjects are selected to create the manual gesture trajectories. While the sixteen manual gesture trajectories of figure 4C are presented in a projected two-dimensional form, the test subjects were unconstrained when performing the various manual gesture trajectories, which allows some three-dimensional variation to appear. In one aspect, each subject performed each manual gesture trajectory K times, producing J training sequences per manual gesture trajectory.
[000201] In one aspect, a discrete left-right Hidden Markov Model was used. Hidden Markov Model Λ was selected so that probability P(O|Λ) is locally maximized using an iterative Baum-Welch method. See, for example, Lawrence R. Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Proceedings of the IEEE, Vol. 77, No. 2, pp. 257-286 (Feb. 1989), which is incorporated herein by reference as a demonstration of the knowledge of Hidden Markov Models of those skilled in the art. In one aspect, the iterative method was stopped when the model converged to within 0.1 percent over three successive iterations.
[000202] The initial state probability π was adjusted so that the model always starts in the first state. Transition probability matrix A was initialized with random entries, which were sorted in descending order on a row-by-row basis. To enforce the left-right structure, all entries in the lower diagonal of transition probability matrix A were set to zero. In addition, transitions of more than two states were disallowed by setting entries to zero where (j - i) > 2 for all rows i and columns j. Transition probability matrix A was normalized at the end on a row-by-row basis.
[000203] The initialization of observation probability matrix B divided the observation sequence evenly based on the desired number of states. Therefore, each state may initially observe one or more symbols with a probability based on a local frequency count. This matrix was also normalized on a row-by-row basis. See, for example, N. Liu, R.I.A. Davis, B.C. Lovell, P.J. Kootsookos, "Effect of Initial HMM Choices in Multiple Sequence Training for Gesture Recognition," International Conference on Information Technology, Las Vegas, pages 608-613 (5-7 April 2004), which is incorporated herein by reference as a demonstration of initialization procedures for Hidden Markov Models known to those skilled in the art. A Hidden Markov Model was developed for each manual gesture trajectory.
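The initialization described in the two preceding paragraphs can be sketched as follows; the state count N, the symbol count M, and the function names are assumptions for illustration:

```python
import numpy as np

def init_left_right_hmm(N, M, obs_seq, seed=None):
    """Initialize left-right discrete HMM parameters (pi, A, B) as described.

    N: number of states; M: number of observation symbols;
    obs_seq: one training observation sequence of symbol indices.
    """
    rng = np.random.default_rng(seed)

    # The model always starts in the first state.
    pi = np.zeros(N)
    pi[0] = 1.0

    # Random rows sorted in descending order; lower diagonal zeroed to
    # enforce left-right structure; jumps of more than two states removed;
    # rows normalized at the end.
    A = np.sort(rng.random((N, N)), axis=1)[:, ::-1]
    for i in range(N):
        for j in range(N):
            if j < i or (j - i) > 2:
                A[i, j] = 0.0
    A /= A.sum(axis=1, keepdims=True)

    # B: split the observation sequence evenly across states and use local
    # symbol frequency counts, then normalize row by row.
    B = np.zeros((N, M))
    for i, seg in enumerate(np.array_split(np.asarray(obs_seq), N)):
        counts = np.bincount(seg, minlength=M) + 1e-6   # avoid zero rows
        B[i] = counts / counts.sum()
    return pi, A, B
```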
[000204] Returning to method 1000, GESTURE MODE ENABLED verification process 1001 determines whether the surgeon has enabled the gesture recognition operating mode of system 100. In one aspect, to enable the gesture recognition mode, the surgeon presses a pedal on surgeon console 185 (figure 1A). If the gesture recognition mode is enabled, verification process 1001 transfers to RECEIVE HAND LOCATION DATA process 1010, and otherwise returns via RETURN 1002.
[000205] RECEIVE HAND LOCATION DATA process 1010 receives the index finger position and orientation (Pindex, Rindex) and the thumb position and orientation (Pthumb, Rthumb) for the pose being performed by the surgeon. As noted above, the index finger position and orientation and the thumb position and orientation are based on data from the tracking system. Process 1010 transfers to GENERATE FEATURES process 1011.
[000206] In GENERATE FEATURES process 1011, the index finger position and orientation (Pindex, Rindex) and the thumb position and orientation (Pthumb, Rthumb) are used to generate each of features f_1 to f_n in an observed feature vector f_o. GENERATE FEATURES process 1011 transfers to COMPARE FEATURE WITH KNOWN POSES process 1012.
[000207] COMPARE FEATURE WITH KNOWN POSES process 1012 compares the observed feature vector f_o with the trained feature set for each position. This process determines the probability that the observed feature vector f_o is included in the training database feature set of a particular manual gesture position, that is, that it corresponds to the training data set. This can be expressed as P(f_o|Ω), where the training database feature set belongs to object class Ω.
[000208] In this example, the probability P(f_o|Ω) is:

$$P(f_o \mid \Omega) = \frac{\exp\!\left(-\tfrac{1}{2}\,(f_o - \bar{f})^{\mathsf T}\,\Sigma^{-1}\,(f_o - \bar{f})\right)}{(2\pi)^{N/2}\,|\Sigma|^{1/2}}$$

where N is the dimensionality of the feature vector (n in the example above), f̄ is the training mean feature vector, and Σ is the training covariance.
[000209] A statistic used to characterize this probability is the Mahalanobis distance, which is defined as:

$$d(f_o) = \tilde{f}^{\mathsf T}\,\Sigma^{-1}\,\tilde{f}$$

where

$$\tilde{f} = f_o - \bar{f}$$

The Mahalanobis distance is known to those skilled in the art. See, for example, Moghaddam, Baback and Pentland, Alex, "Probabilistic Visual Learning for Object Representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, pp. 696-710 (July 1997), which is incorporated herein by reference.
[000210] Using the eigenvectors Φ and eigenvalues Λ of covariance Σ, the inverse covariance

$$\Sigma^{-1} = \Phi\,\Lambda^{-1}\,\Phi^{\mathsf T}$$

is used in diagonalized form, so that the Mahalanobis distance d(f_o) is:

$$d(f_o) = \tilde{f}^{\mathsf T}\,\Phi\,\Lambda^{-1}\,\Phi^{\mathsf T}\,\tilde{f} = \tilde{y}^{\mathsf T}\,\Lambda^{-1}\,\tilde{y}$$

where

$$\tilde{y} = \Phi^{\mathsf T}\,\tilde{f}$$

The diagonalized form allows the Mahalanobis distance d(f_o) to be expressed in terms of the sum:

$$d(f_o) = \sum_{i=1}^{N}\frac{\tilde{y}_i^{\,2}}{\lambda_i}$$
[000211] In this example, this is the expression that is evaluated to determine the Mahalanobis distance d(f_o); therefore, process 1012 generates a Mahalanobis distance d(f_o). Upon completion, process 1012 transfers to SELECT POSE process 1013.
[000212] In SELECT POSE process 1013, the manual gesture position that has the smallest Mahalanobis distance d(f_o) is selected if that Mahalanobis distance d(f_o) is less than the maximum Mahalanobis distance in the training database for that manual gesture position. If the Mahalanobis distance d(f_o) is greater than the maximum Mahalanobis distance in the training database for that manual gesture position, no manual gesture position is selected. SELECT POSE process 1013 transfers to TEMPORAL FILTER process 1014.
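A minimal sketch of the classification in processes 1012 and 1013, assuming precomputed training means, covariances, and per-pose maximum distances; the names are hypothetical:

```python
import numpy as np

def mahalanobis(f_obs, f_mean, cov):
    """Mahalanobis distance via the diagonalized (eigen) form above."""
    lam, phi = np.linalg.eigh(cov)        # eigenvalues/eigenvectors of cov
    y = phi.T @ (f_obs - f_mean)          # project into the eigenbasis
    return float(np.sum(y**2 / lam))

def select_pose(f_obs, poses):
    """Pick the pose with the smallest distance, if under its trained maximum.

    poses: iterable of (name, mean, cov, max_train_distance) tuples.
    Returns the pose name, or None when no gesture position qualifies.
    """
    scored = [(mahalanobis(f_obs, m, c), name, d_max)
              for name, m, c, d_max in poses]
    d, name, d_max = min(scored)
    return name if d <= d_max else None
```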
[000213] TEMPORAL FILTER process 1014 determines whether process 1013 has provided the same result consecutively a predetermined number of times. If process 1013 has provided the same result the predetermined number of times, TEMPORAL FILTER process 1014 transfers to GESTURE POSE verification process 1015, and otherwise returns. The predetermined number of times is selected so that TEMPORAL FILTER process 1014 prevents oscillations or transient detections when switching between manual gesture positions.
[000214] GESTURE POSE verification process 1015 determines whether the selected manual gesture position is the manual gesture position used in a manual gesture trajectory. If the selected manual gesture position is the manual gesture position used in a manual gesture trajectory, GESTURE POSE verification process 1015 transfers processing to GENERATE VELOCITY SEQUENCE process 1020, and otherwise transfers processing to POSE CHANGE verification process 1016.
[000215] POSE CHANGE verification process 1016 determines whether the manual gesture position has changed since the last pass through method 1000. If the selected manual gesture position is equal to the immediately preceding temporally filtered gesture position result, POSE CHANGE verification process 1016 returns via RETURN 1003, and otherwise transfers to MAP TO SYSTEM EVENT process 1030.
[000216] MAP TO SYSTEM EVENT process 1030 maps the selected manual gesture position to a system event; for example, the system event assigned to the manual gesture position is looked up. Upon finding the system event, MAP TO SYSTEM EVENT process 1030 transfers processing to INJECT SYSTEM EVENT process 1031.
[000217] In one aspect, INJECT SYSTEM EVENT process 1031 sends the system event to an event handler in system controller 140 (figure 1). In response to the system event, system controller 140 sends an appropriate system command to the controllers and/or other devices in system 100. For example, if the manual gesture position is assigned to a user interface turn-on event, system controller 140 sends a command to display controller 150 to turn on the user interface. Display controller 150 executes the part of user interface module 155 on processor 150 required to turn on the user interface.
[000218] When the manual gesture position is the manual gesture position used to make a trajectory, processing in method 1000 transfers from GESTURE POSE verification process 1015 to GENERATE VELOCITY SEQUENCE process 1020. In one aspect, the main feature used for manual gesture trajectory recognition is a unit velocity vector. The unit velocity vector is invariant to the starting position of the gesture. In addition, normalization makes the velocity vector invariant to variations in the size or speed of the gesture. Thus, in process 1020, the control point samples are converted into a normalized control point velocity sequence, that is, into a sequence of unit velocity vectors:

$$\hat{v}_k = \frac{P_{cp}(t_{k+1}) - P_{cp}(t_k)}{\left\|P_{cp}(t_{k+1}) - P_{cp}(t_k)\right\|}$$
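A minimal sketch of process 1020, assuming a sequence of sampled control point positions; the names are illustrative:

```python
import numpy as np

def unit_velocity_sequence(positions, eps=1e-9):
    """Convert sampled control point positions into unit velocity vectors.

    positions: (T, 3) array of control point samples.
    Returns a (T-1, 3) array; near-zero displacements are dropped so the
    sequence is invariant to gesture start position, size, and speed.
    """
    deltas = np.diff(np.asarray(positions), axis=0)
    norms = np.linalg.norm(deltas, axis=1)
    keep = norms > eps                       # skip stationary samples
    return deltas[keep] / norms[keep, None]
```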
[000219] Upon completion of GENERATE VELOCITY SEQUENCE process 1020, process 1020 transfers processing to CONVERT VELOCITY SEQUENCE INTO SYMBOL SEQUENCE process 1021. As noted above, Hidden Markov Model Λ requires a sequence of discrete symbols as input. In process 1021, the discrete symbols are generated from the normalized control point velocity sequence through vector quantization.
[000220] In one aspect, the vector quantization was performed using a modified K-means clustering with the condition that the process stops when the cluster assignments stop changing. While K-means clustering is used, the process takes advantage of the fact that the features are unit vectors. In this case, vectors that are similar in direction are clustered together. This is done using the scalar product between each unit feature vector and the cluster center vectors as the similarity metric.
[000221] The clustering is initialized with random vector assignments to thirty-two clusters, and the overall process is repeated several times, with the best clustering result selected based on a maximum total "within"-cluster cost metric. Note that, in this case, the "within"-cluster cost is based on a similarity measure. Each resulting cluster is assigned a unique index, which serves as the symbol for the Hidden Markov Model. An input vector is then mapped to its nearest cluster mean, and the corresponding index of that cluster is used as the symbol. In this way, a sequence of unit velocity vectors can be translated into a sequence of indices or symbols.
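This direction-based quantization can be sketched as follows; the cluster count of thirty-two follows the paragraph above, and the names are illustrative:

```python
import numpy as np

def quantize_directions(vectors, centers):
    """Map unit velocity vectors to the index of the most similar center.

    Similarity is the scalar (dot) product, so vectors close in direction
    receive the same symbol.
    """
    sims = np.asarray(vectors) @ np.asarray(centers).T   # cosine similarities
    return np.argmax(sims, axis=1)                       # symbol sequence

def fit_direction_clusters(vectors, k=32, iters=100, seed=None):
    """Modified K-means on unit vectors, stopping when assignments stabilize."""
    rng = np.random.default_rng(seed)
    V = np.asarray(vectors)
    centers = V[rng.choice(len(V), size=k, replace=False)]
    labels = None
    for _ in range(iters):
        new_labels = quantize_directions(V, centers)
        if labels is not None and np.array_equal(new_labels, labels):
            break                                        # assignments stopped changing
        labels = new_labels
        for j in range(k):
            members = V[labels == j]
            if len(members):
                m = members.mean(axis=0)
                centers[j] = m / np.linalg.norm(m)       # keep centers on the sphere
    return centers
```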
[000222] In one aspect, the clustered vectors were assigned a symbol based on a fixed eight-direction two-dimensional vector quantization codebook. Thus, process 1021 generates an observed symbol sequence and transfers to GENERATE GESTURE PROBABILITY process 1023.
[000223] In one aspect, to determine which gesture corresponds to the observed symbol sequence, GENERATE GESTURE PROBABILITY process 1023 uses the forward recursion algorithm with the Hidden Markov Models to find the probability that each gesture corresponds to the observed symbol sequence. The forward recursion algorithm is described in Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," which was previously incorporated herein by reference. Upon completion of GENERATE GESTURE PROBABILITY process 1023, processing transfers to SELECT TRAJECTORY process 1024.
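The forward recursion itself is compact; a minimal sketch follows. Scaling or log-space arithmetic, typically used in practice to avoid underflow, is omitted for brevity:

```python
import numpy as np

def forward_probability(obs, pi, A, B):
    """P(O | lambda) for a discrete HMM via the forward recursion.

    obs: sequence of symbol indices; pi: (N,) initial distribution;
    A: (N, N) transition matrix; B: (N, M) observation matrix.
    """
    alpha = pi * B[:, obs[0]]               # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]       # induction step
    return float(alpha.sum())               # termination
```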
[000224] In SELECT TRAJECTORY process 1024, the allowable manual gesture trajectory with the highest probability among the Hidden Markov Model trajectory gesture models is selected. This probability must also be greater than a given threshold to be accepted. If the highest probability is not greater than the threshold, no manual gesture trajectory is selected. This threshold should be adjusted to maximize recognition accuracy while avoiding false recognitions.
[000225] Upon completion, SELECT TRAJECTORY process 1024 transfers processing to TRAJECTORY FOUND verification process 1025. If SELECT TRAJECTORY process 1024 selected a manual gesture trajectory, TRAJECTORY FOUND verification process 1025 transfers processing to MAP TO SYSTEM EVENT process 1030, and otherwise returns via RETURN 1004.
[000226] MAP TO SYSTEM EVENT process 1030 maps the selected manual gesture trajectory to a system event; for example, the system event assigned to the manual gesture trajectory is looked up. Upon finding the system event, MAP TO SYSTEM EVENT process 1030 transfers processing to INJECT SYSTEM EVENT process 1031.
[000227] In one aspect, INJECT SYSTEM EVENT process 1031 sends the system event to the event handler in system controller 140 (figure 1). In response to the system event, system controller 140 sends an appropriate system command to the appropriate controller(s) or device(s). For example, if the system event is assigned to an action in the user interface, system controller 140 sends a command to display controller 150 to perform that action in the user interface, for example, changing the viewing mode of the surgical site. Presence Detection Process
[000228] In yet another aspect, as described above, the tracked position of at least part of the hand of surgeon 180B is used to determine whether the hand is present on a main tool retainer 621. Figure 11 is a process flow diagram of an aspect of a presence detection process 1100 performed, in one aspect, by hand tracking controller 130 in system 100. In one aspect, process 1100 is performed separately for each of the surgeon's hands.
[000229] In the GET JOINT ANGLES 1110 process, the joint angles of the main tool manipulator 620 (figure 6B) are measured. The GET JOINT ANGLES 1110 process transfers processing to the GENERATE FORWARD KINEMATICS 1111 process.
[000230] Since the lengths of the various links in main tool manipulator 620 are known, and since the position of base 629 of main tool manipulator 620 is known, geometric relationships are used to generate the location of main tool retainer 621 in main workspace coordinate system 680. Thus, GENERATE FORWARD KINEMATICS process 1111 generates position Pmtm of main tool retainer 621 in main workspace coordinate system 680 using the joint angles from process 1110. GENERATE FORWARD KINEMATICS process 1111 transfers processing to MAP TO WORLD COORDINATES process 1112.
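Forward kinematics of a serial linkage can be sketched by chaining one homogeneous transform per joint. The chain below (revolute Z joints with fixed link offsets) is hypothetical and is not the kinematics of manipulator 620:

```python
import numpy as np

def rot_z(q):
    """Homogeneous rotation about the local Z axis by joint angle q."""
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def translate(x, y, z):
    """Homogeneous translation by (x, y, z)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def forward_kinematics(joint_angles, link_lengths, T_base=np.eye(4)):
    """Position of the end of a serial chain of revolute Z joints.

    joint_angles, link_lengths: one entry per joint/link (hypothetical chain).
    Returns the retainer position in the workspace coordinate system.
    """
    T = T_base.copy()
    for q, L in zip(joint_angles, link_lengths):
        T = T @ rot_z(q) @ translate(L, 0, 0)   # rotate, then move along link
    return T[:3, 3]
```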
[000231] MAP TO WORLD COORDINATES process 1112 maps position Pmtm in main workspace coordinate system 680 to a position Pmtm_wc in universal coordinate system 670 (figure 6A). Specifically,

$$P_{mtm\_wc} = {}^{wc}T_{ws}\,P_{mtm}$$

where wcTws is a rigid four-by-four homogeneous transformation that maps coordinates in workspace coordinate system 680 to coordinates in universal coordinate system 670.
[000232] Upon completion, MAP TO WORLD COORDINATES process 1112 transfers processing to GENERATE HAND TO END EFFECTOR SEPARATION process 1130.
[000233] Turning to RECEIVE HAND LOCATION DATA process 1120, this process receives the index finger and thumb position and orientation data from the hand tracking system. RECEIVE HAND LOCATION DATA process 1120 transfers processing to GENERATE HAND POSITION process 1121.
[000234] GENERATE HAND POSITION process 1121 maps the index finger position and orientation (Pindex, Rindex) and the thumb position and orientation (Pthumb, Rthumb) to a control point position and orientation in tracking coordinates, as described above; that description is incorporated here by reference. Position Phand is the control point position in tracking coordinates. GENERATE HAND POSITION process 1121 transfers processing to MAP TO WORLD COORDINATES process 1122.
[000235] The use of the control point position in presence detection is only illustrative and is not intended to be limiting. In view of this description, the presence detection could be done, for example, using the position of the tip of the index finger and the position of the tip of the thumb, or using only one of those positions. The processes described below are equivalent for each of these positions associated with part of a human hand.
[000236] MAP TO WORLD COORDINATES process 1122 maps position Phand in tracking coordinates to a position Phand_wc in universal coordinate system 670 (figure 6A).

[000237] Specifically,

$$P_{hand\_wc} = {}^{wc}T_{tc}\,P_{hand}$$

where wcTtc is a rigid four-by-four homogeneous transformation that maps coordinates in tracking coordinate system 650 to coordinates in universal coordinate system 670.
[000238] Upon completion, MAP TO WORLD COORDINATES process 1122 transfers processing to GENERATE HAND TO END EFFECTOR SEPARATION process 1130.
[000239] GENERATE HAND TO END EFFECTOR SEPARATION process 1130 generates a separation distance d_sep between position Pmtm_wc in universal coordinate system 670 and position Phand_wc in universal coordinate system 670. In one aspect, separation distance d_sep is:

$$d_{sep} = \left\|P_{mtm\_wc} - P_{hand\_wc}\right\|$$
[000240] Upon completion, GENERATE HAND TO END EFFECTOR SEPARATION process 1130 transfers processing to DISTANCE SAFE verification process 1131.
[000241] DISTANCE SAFE verification process 1131 compares separation distance d_sep with a safe distance threshold. This threshold should be small enough to be conservative while still allowing the surgeon to grasp or manipulate the most distal end of the end effector. If separation distance d_sep is less than the safe distance threshold, DISTANCE SAFE verification process 1131 transfers to HAND PRESENCE ON process 1140. Conversely, if separation distance d_sep is greater than the safe distance threshold, verification process 1131 transfers to HAND PRESENCE OFF process 1150.
[000242] HAND PRESENCE ON process 1140 determines whether system 100 is in teleoperation. If system 100 is in teleoperation, no action is required, teleoperation is allowed to continue, and process 1140 returns to the start of process 1100. If system 100 is not in teleoperation, HAND PRESENCE ON process 1140 sends a hand presence event to the INJECT SYSTEM EVENT process, which in turn sends the hand presence event to system controller 140.
[000243] HAND PRESENCE OFF process 1150 determines whether system 100 is in teleoperation. If system 100 is not in teleoperation, no action is required, and process 1150 returns to the start of process 1100. If system 100 is in teleoperation, HAND PRESENCE OFF process 1150 sends a hand absence event to the INJECT SYSTEM EVENT process, which in turn sends the hand absence event to system controller 140.
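The core of process 1100 reduces to a threshold test on the separation distance; a minimal sketch, in which the threshold value is an assumption:

```python
import numpy as np

def hand_presence_event(p_mtm_wc, p_hand_wc, teleoperating, safe_distance=0.10):
    """Decide which presence event, if any, to inject.

    p_mtm_wc, p_hand_wc: retainer and hand control point positions in the
    universal coordinate system; safe_distance is a hypothetical threshold
    in meters. Returns 'hand_present', 'hand_absent', or None.
    """
    d_sep = np.linalg.norm(p_mtm_wc - p_hand_wc)
    if d_sep < safe_distance:
        return None if teleoperating else 'hand_present'
    return 'hand_absent' if teleoperating else None
```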
[000244] System controller 140 determines whether the hand presence event or the hand absence event requires any change in the system operating mode and issues an appropriate command. In one aspect, system controller 140 enables teleoperation in response to a hand presence event and disables teleoperation in response to a hand absence event, if a teleoperated minimally invasive surgical instrument is attached to the main tool retainer. As known to those skilled in the art, a teleoperated minimally invasive surgical instrument is attachable to and detachable from a main tool retainer.
[000245] In other aspects, the hand presence event and the hand absence event are used by system controller 140 in combination with other events to determine whether to allow teleoperation. For example, detection of the presence of the surgeon's head can be combined with detection of the presence of the surgeon's hand or hands when determining whether to allow teleoperation.
[000246] Similarly, as described above, the hand presence event and the hand absence event are used by system controller 140 to control the display of a user interface on a display of the minimally invasive surgical system. When system controller 140 receives the hand absence event, if the user interface is not turned on, system controller 140 sends a command to display controller 150 to turn on the user interface. Display controller 150 executes the part of user interface module 155 on processor 150 required to turn on the user interface. When system controller 140 receives the hand presence event, if the user interface is on, system controller 140 sends a command to display controller 150 to turn off the user interface. Display controller 150 executes the part of user interface module 155 on processor 150 required to turn off the user interface.
[000247] The hand presence event and the hand absence event can be used by system controller 140 in combination with other events to determine whether to display the user interface. Thus, user interface display control and teleoperation control are examples of system mode control using presence detection, and presence detection is not intended to be limited to these two specific system control modes.
[000248] For example, presence detection could be used to control a visual proxy like those described more fully below. Also, combinations of the various modes, for example, teleoperation and proxy visual display, could be controlled by system controller 140 based on the hand presence event and the hand absence event.
[000249] Also, hand presence detection is useful in eliminating the dual function of main tool retainers 621L, 621R, for example, pushing a pedal and then using main tool retainers 621L, 621R to control a user interface that is displayed on surgeon console 185B. When the main tool retainers have a dual function, for example, being used to control both a surgical instrument and a user interface, the surgeon typically must push a pedal to switch to the user interface operating mode. If for some reason the surgeon fails to push the pedal but believes that the system has switched to the user interface operating mode, the motion of the main tool retainer may result in unwanted motion of the surgical instrument. Presence detection process 1100 is used to avoid this problem and to eliminate the dual function of the main tool retainers.
[000251] With presence detection process 1100, in one example, when the hand absence event is received by system controller 140, system controller 140 sends a system command to lock main tool manipulators 620L, 620R (figure 6A) in place, and sends a system command to display controller 150 to present the user interface on the display device of surgeon console 185B. The motion of the surgeon's hand is tracked and used to control elements in the user interface, for example, moving a slider, changing the display, etc. As noted above, the control point is mapped into the eye coordinate frame and can therefore be associated with the location of an element in the user interface. The motion of the control point is used to manipulate that element. This is accomplished without the surgeon having to activate a pedal, and it is done so that the surgeon cannot inadvertently move a surgical instrument. This eliminates the problems associated with using the main tool retainers to control both the surgical instrument and the user interface.
[000252] In the example above, the universal coordinate frame is an example of a common coordinate frame. The use of the universal coordinate frame as the common coordinate frame is illustrative only and is not intended to be limiting.
Primary Finger Tracking Retainer
[000253] Figure 12 is an illustration of an example of a main finger tracking retainer 1270.
[000254] Main finger tracking retainer 1270 is an example of main finger tracking retainers 170, 270.
[000255] Main finger tracking retainer 1270 includes a compressible body 1210 and two finger loops 1220, 1230. Compressible body 1210 has a first end 1213 and a second end 1214. A body section 1215 extends between first end 1213 and second end 1214.
[000256] The compressible body 1210 has an external surface. The outer surface includes a first portion 1216 and a second portion 1217. The first portion 1216, for example, an upper portion, extends between the first end 1213 and the second end 1214. The second portion 1217, for example, a lower portion, extends between the first end 1213 and the second end 1214. The second portion 1217 is opposite and removed from the first portion 1216.
[000257] In one aspect, the outer surface is a surface of a fabric wrap. The fabric is suitable for use in an operating room environment. The fabric wrap encloses compressible foam. The foam is selected to provide resistance to compression and to expand as the compression is released. In one aspect, several strips of foam are included within the fabric wrap. The foam must also be capable of bending so that first portion 1216 is positioned between the first and second fingers of a human hand as the tip of the first finger is moved toward the tip of the second finger.
[000258] Body section 1215 has a length L between finger loop 1220 and finger loop 1230.
[000259] As explained above, length L is selected to limit the separation between a first finger in loop 1220 and a second finger in loop 1230. (See figure 2A.)
[000260] In one aspect, body section 1215 has a thickness T. As illustrated in figure 2C, thickness T is selected so that when main finger tracking retainer 1270 is configured such that region 1236 in second portion 1217 of the external surface adjacent to end 1214 and region 1226 in second portion 1217 adjacent to end 1213 are just touching, second portion 1217 is not in complete contact with itself along length L.
[000261] The first finger loop 1220 is attached to compressible body 1210 adjacent to first end 1213. Loop 1220 extends around a region 1225 of first portion 1216 of the external surface of compressible body 1210. When first finger loop 1220 is placed on the first finger of the human hand, region 1225 contacts the first finger, for example, a first part of first portion 1216 of the outer surface contacts the thumb.
[000262] In this example, finger loop 1220 has two ends, a first fabric end 1221A and a second fabric end 1221B. End 1221A and end 1221B are ends of a strip of fabric that is attached to body 1210. A piece of hook fabric 1222B is attached to an inner surface of end 1221B, and a piece of loop fabric 1222A is attached to an outer surface of end 1221A. An example of hook and loop fabric is a nylon fastening tape consisting of two strips of nylon fabric, one having tiny hooks and the other having a surface of small loops. The two strips form a strong bond when pressed together. An example of a commercially available fastening tape is the VELCRO® fastener. (VELCRO® is a registered trademark of Velcro Industries B.V.)
[000263] The second finger loop 1230 is attached to compressible body 1210 adjacent to second end 1214. Loop 1230 extends around a region 1235 of first portion 1216 of the outer surface of compressible body 1210. When second finger loop 1230 is placed on the second finger of the human hand, region 1235 contacts the second finger, for example, a second part of first portion 1216 of the outer surface contacts the index finger. The second part 1235 of the first portion is opposite and removed from the first part 1225 of the first portion.
[000264] In this example, finger loop 1230 also has two ends, a first fabric end 1231A and a second fabric end 1231B. End 1231A and end 1231B are ends of a strip of fabric that is attached to body 1210. A piece of hook fabric 1232B is attached to an inner surface of end 1231B, and a piece of loop fabric 1232A is attached to an outer surface of end 1231A.
[000265] A first location tracking sensor 1211 is attached to first finger loop 1220. A second location tracking sensor 1212 is attached to second finger loop 1230. The location tracking sensors can be any of the sensor elements described above. In one example, location tracking sensors 1211, 1212 are passive electromagnetic sensors.
Visual Proxy System
[000266] In one aspect, the hand tracking control system is used to control any one of a plurality of visual proxies that can be used by one surgeon to monitor another surgeon. For example, when surgeon 180 (figure 1A) is being monitored by surgeon 181 using main finger tracking retainer 170, surgeon 181 uses main finger tracking retainer 170 to control a visual proxy for a surgical instrument, while surgeon 180 uses the main tool retainer to control a teleoperated auxiliary surgical instrument.
[000267] Alternatively, surgeon 181 can telestrate, superimposing marks or images on the display device, or can control a virtual hand on the display device. Also, surgeon 181 can demonstrate how to manipulate the main tool retainer on the surgeon's console by manipulating a virtual image of main tool retainer 621 that is displayed on the display device. These visual proxy examples are illustrative only and are not intended to be limiting.
[000268] Furthermore, the use of main finger tracking retainer 170 while not at a surgeon's console is also illustrative only and is not intended to be limiting. For example, with the presence detection system described above, a surgeon at a surgeon's console could move one hand away from a main tool retainer, and then use that hand to monitor another surgeon as the hand is tracked by the hand tracking system.
[000269] To facilitate monitoring, in one aspect a visual proxy module (not shown) is executed as part of a vision processing subsystem. The executing module receives the position and orientation of the control point of the monitoring surgeon's hand, and creates stereographic images that are composited with real-time endoscopic camera images and displayed on any combination of surgeon's console 185, the auxiliary display device, and patient-side surgeon interface display device 187.
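The compositing step can be pictured as a simple alpha blend of a rendered proxy over each camera frame; the sketch below is a minimal illustration under that assumption, and the function name, frame shapes, and blending scheme are hypothetical rather than taken from this description.

    import numpy as np

    def composite(camera_rgb, proxy_rgba):
        # Alpha-blend a rendered proxy image (H x W x 4, uint8) over a
        # camera frame (H x W x 3, uint8); repeat per eye for stereo.
        alpha = proxy_rgba[..., 3:4].astype(np.float32) / 255.0
        blended = proxy_rgba[..., :3] * alpha + camera_rgb * (1.0 - alpha)
        return blended.astype(np.uint8)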
[000270] When surgeon 181 initiates monitoring by performing a predefined action, for example, a hand gesture position, a visual proxy system loop is activated, for example, the visual proxy module is executed on a processor module. The particular action, for example, a hand gesture position, used as the predefined action is not essential as long as system controller 140 (figure 1) is configured to identify that action.
[000271] In one aspect, the visual proxy is a virtual ghost instrument 1311 (figure 13) controlled by main finger tracking retainer 170, while teleoperated auxiliary surgical instrument 1310 is controlled by one of the main tool manipulators on surgeon's console 185. Surgeon 181 views instruments 1310 and 1311 on display device 187, while surgeon 180 views instruments 1310 and 1311 on the stereoscopic display device of surgeon's console 185. The use of virtual ghost instrument 1311 as a visual proxy is illustrative only and is not intended to be limiting to that particular image. In view of this description, other images can be used for the visual proxy, so long as they facilitate the differentiation between the image representing the visual proxy and the image of the final actuator of the real teleoperated auxiliary surgical instrument.
[000272] The virtual ghost instrument 1311 appears similar to the real instrument 1310, except that virtual ghost instrument 1311 is displayed in a way that clearly distinguishes it from real instrument 1310 (for example, a transparent or translucent ghost image, a distinctly colored image, etc.). The control and operation of virtual ghost instrument 1311 are the same as those described above for a real teleoperated surgical instrument. Thus, surgeon 181 can manipulate virtual ghost instrument 1311 using main finger tracking retainer 170 to demonstrate the proper use of teleoperated auxiliary surgical instrument 1310. Surgeon 180 can then mimic the movement of virtual ghost instrument 1311 with instrument 1310.
[000273] Virtual ghost instruments are more fully described in commonly assigned U.S. Patent Application Publication No. US 2009/0192523 A1 (filed March 31, 2009; describing "Synthetic Representation of a Surgical Instrument"), which is hereby incorporated by reference in its entirety. See also U.S. Patent Application No. 12/485,503 (filed June 16, 2009; describing "Virtual Measurement Tool for Minimally Invasive Surgery"); U.S. Patent Application No. 12/485,545 (filed June 16, 2009; describing "Virtual Measurement Tool for Minimally Invasive Surgery"); U.S. Patent Application Publication No. US 2009/0036902 A1 (filed August 11, 2008; describing "Interactive User Interfaces for Robotic Minimally Invasive Surgical Systems"); U.S. Patent Application Publication No. US 2007/0167702 A1 (filed December 30, 2005; describing "Medical Robotic System Providing Three-Dimensional Telestration"); U.S. Patent Application Publication No. US 2007/0156017 A1 (filed December 30, 2005; describing "Stereo Telestration for Robotic Surgery"); and U.S. Patent Application Publication No. US 2010/0164950 A1 (filed May 13, 2009; describing "Efficient 3-D Telestration for Local Robotic Proctoring"), each of which is hereby incorporated by reference in its entirety.
[000274] In another aspect, the visual proxy is a pair of virtual hands 1410, 1411 (figure 14) controlled by main finger tracking retainer 170 and a second main finger tracking retainer, which is not visible in figure 1. Teleoperated auxiliary surgical instruments 1420, 1421 are controlled by the main tool manipulators on surgeon's console 185. Surgeon 181 views video image 1400 on display device 187, and surgeon 180 also views video image 1400 on the stereoscopic display device of surgeon's console 185. Virtual hands 1410, 1411 are displayed in a way that clearly distinguishes them from the other objects in video image 1400.
[000275] The opening and closing of the thumb and index finger of a virtual hand are controlled using the retainer closure parameter g_grip that was described above. The position and orientation of the virtual hand are controlled by the position and orientation of the control point, as described above, which are mapped into the eye coordinate space, also as described above.
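For illustration only, the sketch below drives a virtual hand from the retainer closure parameter and an eye-frame control point pose; the function, its arguments, and the 60-degree maximum opening are assumptions for the sketch, not values given in this description.

    def virtual_hand_state(g_grip, p_eye, R_eye, max_open_deg=60.0):
        # g_grip in [0, 1]: 0 = thumb and index fully open, 1 = fingertips touching.
        # p_eye and R_eye are the control point position and orientation,
        # already mapped into the eye coordinate space as described above.
        jaw_deg = (1.0 - g_grip) * max_open_deg   # thumb/index opening angle
        return {"jaw_deg": jaw_deg, "position": p_eye, "orientation": R_eye}

The renderer would then pose the virtual hand model from this state each frame, so closing the surgeon's fingers closes the virtual hand in the video image.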
[000276] Thus, as surgeon 181 moves his right hand in three dimensions, virtual hand 1411 follows the movement in video image 1400. Surgeon 181 can raise virtual hand 1411 to instruct surgeon 180 to lift teleoperated auxiliary surgical instrument 1421. Surgeon 181 can move virtual hand 1410 to a particular location and then use the movement of the thumb and index finger to instruct surgeon 180 to move teleoperated auxiliary surgical instrument 1420 to that location and grab the tissue. When surgeon 180 grabs the tissue with instrument 1420, surgeon 181 can use virtual hand 1410 to instruct surgeon 180 how to move the tissue. All of this takes place in real time, and virtual hands 1410, 1411 are superimposed on the stereoscopic endoscopic image. However, the visual proxies can also be used in a monoscopic system.
[000277] In another aspect, surgeon 181 changes the display mode using a hand gesture position so that the visual proxies are a virtual ghost instrument 1510 and a virtual telestration device 1511, which are shown in video image 1500 (figure 15). Virtual telestration device 1511 is controlled by main finger tracking retainer 170, while a second main finger tracking retainer, which is not visible in figure 1, controls virtual ghost instrument 1510.
[000278] Teleoperated auxiliary surgical instruments 1520, 1521 are controlled by the main tool manipulators on surgeon's console 185. Surgeon 181 views video image 1500 on display device 187, and surgeon 180 also views video image 1500 on the stereoscopic display device of surgeon's console 185. Virtual telestration device 1511 and virtual ghost instrument 1510 are displayed in a way that clearly distinguishes them from the other objects in video image 1500.
[000279] To telestrate with virtual telestration device 1511, surgeon 181 places the thumb and index finger as if holding an imaginary pen or pencil, and then moves the right hand with the thumb and index finger in that position to draw marks on the displayed video image. In video image 1500, surgeon 181 positioned the thumb and index finger in this way and made a mark 1512 to illustrate the location of the tissue to be cut using surgical instrument 1521. After mark 1512 was made, surgeon 181 separated the thumb and index finger and moved virtual telestration device 1511 to the position shown in video image 1500.
[000280] The marking capability of virtual telestration device 1511 is controlled using the retainer closure parameter g_grip that was described above. As noted above, when the thumb and index finger are just touching, the retainer closure parameter g_grip is mapped to an initial value in a second range, and then, while the retainer closure parameter g_grip is in the second range, telestration by telestration device 1511 is enabled. The position and orientation of the control point, after being mapped to the eye coordinate system, are used to control the movement of virtual telestration device 1511.
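A minimal sketch of the two-range mapping of g_grip follows; the threshold distances are illustrative assumptions, not values taken from this description, and the mapping paraphrases the closure-distance clamping and the neutral-distance split described above.

    # Illustrative thresholds in meters; not values from this description.
    D_MIN, D_NEUTRAL, D_MAX = 0.005, 0.015, 0.045

    def grip_closure_parameter(d_eval):
        # Clamp the evaluated fingertip separation into [D_MIN, D_MAX].
        d = min(max(d_eval, D_MIN), D_MAX)
        if d > D_NEUTRAL:
            # First range, 0..1: fingers apart, controls jaw open/close.
            return (D_MAX - d) / (D_MAX - D_NEUTRAL)
        # Second range, 1..2: fingertips touching or pressed together;
        # entering this range enables telestration (or closed-jaw grip force).
        return 1.0 + (D_NEUTRAL - d) / (D_NEUTRAL - D_MIN)

With such a mapping, the value on first touching is 1.0, the start of the second range, and the telestration mark is drawn only while the returned value remains in that range.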
[000281] The above description and the accompanying drawings illustrating the aspects and modalities of the present invention should not be considered limiting; the claims define the protected inventions. Various mechanical, compositional, structural, electrical, and operational changes can be made without departing from the spirit and scope of the description and claims. In some cases, well-known circuits, structures, and techniques have not been shown or described in detail to avoid obscuring the invention.
[000282] Furthermore, the terminology of this description is not intended to limit the invention. For example, spatially relative terms, such as "under", "below", "lower", "above", "upper", "proximal", "distal", and the like, can be used to describe a relationship of one element or feature to another element or feature as illustrated in the figures. These spatially relative terms are intended to include different positions (that is, locations) and orientations (that is, rotational positions) of the device in use or operation, in addition to the position and orientation shown in the figures. For example, if the device in the figures is inverted, elements described as "below" or "under" other elements or features would then be "above" or "over" the other elements or features. Thus, the exemplary term "below" can encompass both above and below positions and orientations. The device can be oriented otherwise (rotated 90 degrees or in other orientations), and the spatially relative descriptors used here are interpreted accordingly. Likewise, descriptions of movement along and around various axes include various spatial device positions and orientations.
[000283] The singular forms "a", "an", and "the" are intended to include the plural forms as well, except where the context indicates otherwise. The terms "comprises", "comprising", "includes", and the like specify the presence of the stated features, steps, operations, process elements, and/or components, but do not exclude the presence or addition of one or more other features, steps, operations, process elements, components, and/or groups. Components described as coupled can be directly mechanically or electrically coupled, or they can be indirectly coupled through one or more intermediate components.
[000284] Memory refers to a volatile memory, a non-volatile memory, or any combination of the two. A processor is coupled to a memory that contains instructions executed by the processor. This could be accomplished within a computer system, or alternatively through a connection to another computer via modems and analog lines, or via digital interfaces and a digital carrier line.
[000285] Here, a computer program product includes a medium configured to store computer-readable code needed for any one or any combination of the processes described in relation to hand tracking, or in which computer-readable code for any one or any combination of the processes described in relation to hand tracking is stored. Some examples of computer program products are CD-ROM discs, DVD discs, flash memory, ROM cards, floppy disks, magnetic tapes, computer hard drives, servers on a network, and signals transmitted over a network representing computer-readable program code. A non-transitory tangible computer program product includes a non-transitory tangible medium configured to store computer-readable instructions for any one, or any combination, of the processes described in relation to the various controllers, or in which computer-readable instructions for any one, or any combination, of the processes described in relation to the various controllers are stored. Non-transitory tangible computer program products are CD-ROM discs, DVD discs, flash memory, ROM cards, floppy disks, magnetic tapes, computer hard disks, and other non-transitory physical storage media.
[000286] In view of this description, the instructions used in any one, or any combination, of the processes described in relation to hand tracking can be implemented in a wide range of computer system configurations using an operating system and a computer programming language of interest to the user.
[000287] The use of different memories and processors in figure 1 is illustrative only and is not intended to be limiting. In some aspects, a single hardware processor could be used, and in other aspects, multiple processors could be used.
[000288] Also, for ease of illustration, the various processes were distributed between a hand tracking controller and a system controller. This also is illustrative and is not intended to be limiting. The various processes can be distributed across controllers or consolidated into one controller without changing the principles of operation of the hand tracking process.
[000289] All examples and illustrative references are non-limiting and should not be used to limit the claims to the specific implementations and modalities described here and their equivalents. Headings are for formatting only and should not be used to limit the subject matter in any way, because text under one heading can cross-reference or apply to text under one or more other headings. Finally, in view of this description, particular features described in relation to one aspect or modality can be applied to other described aspects or modalities of the invention, even if not specifically shown in the drawings or described in the text.
Claims (20)
[0001]
1. Method characterized by the fact that it comprises: receiving, in a controller (130, 140), a first location of a first sensor (1211) mounted on a first single part of a first human hand; receiving, by the controller (130, 140), a second location of a second sensor (1212) mounted on a second single part of the first human hand; generating, by the controller (130, 140), a position and orientation of a control point based on the first location received and the second location received; and controlling, by the controller (130, 140), teleoperation of a device in a minimally invasive surgical system based on the position and orientation of the control point.
[0002]
2. Method, according to claim 1, characterized by the fact that the device is a virtual proxy.
[0003]
3. Method, according to claim 1, characterized by the fact that the device comprises a teleoperated auxiliary surgical instrument; and wherein the method further comprises: generating, by the controller (130, 140), a retainer closing parameter based on the first and second locations, wherein controlling the teleoperation further comprises controlling, by the controller (130, 140), the retainer of a final actuator of the teleoperated auxiliary surgical instrument based on the retainer closing parameter.
[0004]
4. Method, according to claim 3, characterized by the fact that generating the retainer closing parameter further comprises: mapping, by the controller (130, 140), a first position to a first projected position on a pointing axis of a control point frame based on the first location received; mapping, by the controller (130, 140), a second position to a second projected position on the pointing axis of the control point frame based on the second location received; and generating, by the controller (130, 140), an evaluation retainer closing distance using the first position, the second position, the first projected position, and the second projected position.
[0005]
5. Method, according to claim 4, characterized by the fact that generating the retainer closing parameter further comprises: adjusting, by the controller (130, 140), a retainer closing distance to a maximum limit distance when the evaluation retainer closing distance is greater than the maximum limit distance; adjusting, by the controller (130, 140), the retainer closing distance to a minimum limit distance when the evaluation retainer closing distance is less than the minimum limit distance; and adjusting, by the controller (130, 140), the retainer closing distance to the evaluation retainer closing distance when the evaluation retainer closing distance is greater than the minimum limit distance and less than the maximum limit distance.
[0006]
6. Method, according to claim 5, characterized by the fact that generating the retainer closing parameter further comprises: mapping, by the controller (130, 140), the retainer closing distance to a value in a first range of the retainer closing parameter to control the opening/closing of jaws of a final actuator of an auxiliary surgical instrument when the retainer closing distance is greater than a neutral distance; and mapping, by the controller (130, 140), the retainer closing distance to a value in a second range of the retainer closing parameter to control the closed-jaw grip force of the final actuator when the retainer closing distance is less than the neutral distance.
[0007]
7. Method, according to claim 1, characterized by the fact that it further comprises: receiving, by the controller (130, 140), a third location of a third sensor mounted on a first single part of a second human hand; receiving, by the controller (130, 140), a fourth location of a fourth sensor mounted on a second single part of the second human hand; and generating, by the controller (130, 140), a position and an orientation of a second control point based on the third location received and the fourth location received.
[0008]
8. Method, according to claim 7, characterized by the fact that both the control point and the second control point are used in controlling the teleoperation.
[0009]
9. Method, according to claim 1, characterized by the fact that the device is an endoscopic camera, and controlling the teleoperation further comprises controlling the teleoperation of an endoscopic camera manipulator.
[0010]
10. Method characterized by the fact that it comprises: receiving, by a hand tracking controller (130, 140), a first location of a first sensor (1211) mounted on a first single part of a human hand; receiving, by the hand tracking controller (130, 140), a second location of a second sensor (1212) mounted on a second single part of the human hand; determining, by the controller (130, 140), a relative movement between the first and second single parts of the human hand based on the first and second locations received; and controlling, by the controller (130, 140), the orientation of a device in a minimally invasive surgical system based on the determined relative movement.
[0011]
11. Method, according to claim 10, characterized by the fact that: the determined relative movement is a first movement, and controlling the orientation further comprises controlling the yaw movement of a wrist of an auxiliary surgical instrument.
[0012]
12. Method, according to claim 11, characterized by the fact that: the determined relative movement is a second movement different from the first movement, and controlling the orientation comprises rolling a tip of the auxiliary surgical instrument wrist about its pointing direction.
[0013]
13. Minimally invasive surgical system characterized by the fact that it comprises: a first sensor element (1211) configured to be mounted on a first single part of a human hand; a second sensor element (1212) configured to be mounted on a second single part of the human hand; a hand tracking system coupled to the first and second sensor elements (1211, 1212), wherein the hand tracking system is configured to track a first location of the first sensor element (1211) and a second location of the second sensor element (1212); and a controller (130, 140) coupled to the hand tracking system, wherein the controller (130, 140) is configured to: transform the first and second locations into a control point position and a control point orientation, and send a command to move a device of the minimally invasive surgical system based on the control point.
[0014]
14. System, according to claim 13, characterized by the fact that the controller (130, 140) is further configured to: transform the first and second locations into a retainer closing parameter; and base the command on the retainer closing parameter.
[0015]
15. System according to claim 13, characterized in that the first and second sensor elements (1211, 1212) comprise a plurality of fiducial markers.
[0016]
16. System according to claim 13, characterized in that the first and second sensor elements (1211, 1212) comprise a plurality of passive electromagnetic sensors.
[0017]
17. System according to claim 16, characterized by the fact that it further comprises: a main finger tracking device that includes the first and second sensor elements (1211, 1212), each of the first and second sensor elements (1211, 1212) comprising a passive electromagnetic sensor.
[0018]
18. System according to claim 13, characterized by the fact that it further comprises: a main finger tracking device (1270) which includes: the first sensor element (1211); the second sensor element (1212); a first finger loop (1220); a second finger loop (1230); and a compressible body (1210) comprising a first end and a second end opposite the first end, the first finger loop (1220) being attached adjacent to the first end and the second finger loop (1230) being attached adjacent to the second end, wherein, when the first finger loop (1220) is placed on a first finger (292A) of the human hand, the first sensor element (1211) is positioned adjacent to the first finger (292A) of the human hand, and when the second finger loop (1230) is placed on a second finger (292B) of the human hand, the second sensor element (1212) is positioned adjacent to the second finger (292B) of the human hand, and the compressible body (1210) is configured so that the compressible body (1210) is positioned between the first and second fingers (292A, 292B) and provides resistance to the movement of the first finger (292A) toward the second finger (292B).
[0019]
19. System according to claim 18, characterized by the fact that the first sensor element (1211) and the second sensor element (1212) are each a passive electromagnetic sensor.
[0020]
20. System, according to claim 13, characterized by the fact that the device is a virtual proxy.
Cited references:
Publication number | Filing date | Publication date | Applicant | Title
FR2416094A1|1978-02-01|1979-08-31|Zarudiansky Alain|REMOTE HANDLING DEVICE|
DE4306786C1|1993-03-04|1994-02-10|Wolfgang Daum|Hand-type surgical manipulator for areas hard to reach - has distal components actuated by fingers via Bowden cables|
US5791231A|1993-05-17|1998-08-11|Endorobotics Corporation|Surgical robotic system and hydraulic actuator therefor|
US6110130A|1997-04-21|2000-08-29|Virtual Technologies, Inc.|Exoskeleton device for directly measuring fingertip position and inferring finger joint angle|
WO1998051451A2|1997-05-12|1998-11-19|Virtual Technologies, Inc.|Force-feedback interface device for the hand|
US7472047B2|1997-05-12|2008-12-30|Immersion Corporation|System and method for constraining a graphical hand from penetrating simulated graphical objects|
JP2000132305A|1998-10-23|2000-05-12|Olympus Optical Co Ltd|Operation input device|
US6799065B1|1998-12-08|2004-09-28|Intuitive Surgical, Inc.|Image shifting apparatus and method for a telerobotic system|
US6331181B1|1998-12-08|2001-12-18|Intuitive Surgical, Inc.|Surgical robotic tools, data architecture, and use|
US6424885B1|1999-04-07|2002-07-23|Intuitive Surgical, Inc.|Camera referenced control in a minimally invasive surgical apparatus|
US6809462B2|2000-04-05|2004-10-26|Sri International|Electroactive polymer sensors|
JP2002059380A|2000-08-22|2002-02-26|Olympus Optical Co Ltd|Master-slave device|
US9002518B2|2003-06-30|2015-04-07|Intuitive Surgical Operations, Inc.|Maximum torque driving of robotic surgical tools in robotic surgical systems|
AU2003224365A1|2002-05-27|2003-12-12|Koninklijke Philips Electronics N.V.|Passive data carrier with signal evaluation means for evaluating information of a self-clocking signal|
US7410483B2|2003-05-23|2008-08-12|Novare Surgical Systems, Inc.|Hand-actuated device for remote manipulation of a grasping tool|
US8064985B2|2003-09-12|2011-11-22|Ge Medical Systems Global Technology Company|System and method for determining the position of a flexible instrument used in a tracking system|
JP2005224528A|2004-02-16|2005-08-25|Olympus Corp|Endoscope|
US7386365B2|2004-05-04|2008-06-10|Intuitive Surgical, Inc.|Tool grip calibration for robotic surgery|
ES2262423B1|2005-02-18|2007-11-16|Manuel Fernandez Guerrero|IONIZING RADIATION AUTOMATIC ACTIVATION AND DEACTIVATION SYSTEM CONTROLLED BY THE OPERATOR'S LOOK.|
US8398541B2|2006-06-06|2013-03-19|Intuitive Surgical Operations, Inc.|Interactive user interfaces for robotic minimally invasive surgical systems|
US20070129626A1|2005-11-23|2007-06-07|Prakash Mahesh|Methods and systems for facilitating surgical procedures|
US20070167702A1|2005-12-30|2007-07-19|Intuitive Surgical Inc.|Medical robotic system providing three-dimensional telestration|
US7907166B2|2005-12-30|2011-03-15|Intuitive Surgical Operations, Inc.|Stereo telestration for robotic surgery|
US20090192523A1|2006-06-29|2009-07-30|Intuitive Surgical, Inc.|Synthetic representation of a surgical instrument|
KR101494283B1|2006-06-13|2015-02-23|인튜어티브 서지컬 인코포레이티드|Minimally invasive surgical system|
WO2008042220A2|2006-09-29|2008-04-10|Nellcor Puritan Bennett Llc|User interface and identification in a medical device system and method|
US8682502B2|2007-03-28|2014-03-25|Irobot Corporation|Remote vehicle control system and method|
US20080275367A1|2007-04-23|2008-11-06|Hansen Medical, Inc|Systems and methods for mapping intra-body tissue compliance|
US20090012533A1|2007-04-23|2009-01-08|Hansen Medical, Inc.|Robotic instrument control system|
US20090138025A1|2007-05-04|2009-05-28|Hansen Medical, Inc.|Apparatus systems and methods for forming a working platform of a robotic instrument system by manipulation of components having controllably rigidity|
US9881520B2|2008-01-08|2018-01-30|Immersion Medical, Inc.|Virtual tool manipulation system|
US8830224B2|2008-12-31|2014-09-09|Intuitive Surgical Operations, Inc.|Efficient 3-D telestration for local robotic proctoring|
US9492240B2|2009-06-16|2016-11-15|Intuitive Surgical Operations, Inc.|Virtual measurement tool for minimally invasive surgery|
US9155592B2|2009-06-16|2015-10-13|Intuitive Surgical Operations, Inc.|Virtual measurement tool for minimally invasive surgery|ITPI20040084A1|2004-11-18|2005-02-18|Massimo Bergamasco|PORTABLE APTIC INTERFACE|
US9266239B2|2005-12-27|2016-02-23|Intuitive Surgical Operations, Inc.|Constraint based control in a minimally invasive surgical apparatus|
US8332072B1|2008-08-22|2012-12-11|Titan Medical Inc.|Robotic hand controller|
US10532466B2|2008-08-22|2020-01-14|Titan Medical Inc.|Robotic hand controller|
US8423182B2|2009-03-09|2013-04-16|Intuitive Surgical Operations, Inc.|Adaptable integrated energy control system for electrosurgical tools in robotic surgical systems|
US20110172550A1|2009-07-21|2011-07-14|Michael Scott Martin|Uspa: systems and methods for ems device communication interface|
US8521331B2|2009-11-13|2013-08-27|Intuitive Surgical Operations, Inc.|Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument|
CN102934142B|2010-04-09|2016-10-05|卓尔医学产品公司|System and method for EMS device communication interface|
US8996173B2|2010-09-21|2015-03-31|Intuitive Surgical Operations, Inc.|Method and apparatus for hand gesture control in a minimally invasive surgical system|
US8935003B2|2010-09-21|2015-01-13|Intuitive Surgical Operations|Method and system for hand presence detection in a minimally invasive surgical system|
US20120131513A1|2010-11-19|2012-05-24|Microsoft Corporation|Gesture Recognition Training|
EP2617530B1|2010-11-30|2015-11-18|Olympus Corporation|Master operation input device and master-slave manipulator|
US20120224040A1|2011-03-03|2012-09-06|Hand Held Products, Inc.|Imager reader with hand gesture interface|
US20130060166A1|2011-09-01|2013-03-07|The Regents Of The University Of California|Device and method for providing hand rehabilitation and assessment of hand function|
US9924907B2|2011-09-30|2018-03-27|Google Technology Holdings LLC|Method and system for identifying location of a touched body part|
US9241770B2|2011-09-30|2016-01-26|Ethicon Endo-Surgery, Inc.|Methods and devices for remotely controlling movement of surgical tools|
US9014850B2|2012-01-13|2015-04-21|Toyota Motor Engineering & Manufacturing North America, Inc.|Methods and computer-program products for evaluating grasp patterns, and robots incorporating the same|
KR101929451B1|2012-02-03|2018-12-14|삼성전자주식회사|Controlling apparatus and method for robot|
CN104105577B|2012-02-15|2017-06-27|直观外科手术操作公司|Use pattern distinguishes user's selection of the robot system operator scheme of Operator action|
US9445876B2|2012-02-27|2016-09-20|Covidien Lp|Glove with sensory elements incorporated therein for controlling at least one surgical instrument|
JP6053358B2|2012-07-03|2016-12-27|オリンパス株式会社|Surgery support device|
KR101997566B1|2012-08-07|2019-07-08|삼성전자주식회사|Surgical robot system and control method thereof|
KR102328291B1|2012-08-15|2021-11-19|인튜어티브 서지컬 오퍼레이션즈 인코포레이티드|Movable surgical mounting platform controlled by manual motion of robotic arms|
EP2895098A4|2012-09-17|2016-05-25|Intuitive Surgical Operations|Methods and systems for assigning input devices to teleoperated surgical instrument functions|
EP2901368A4|2012-09-28|2016-05-25|Zoll Medical Corp|Systems and methods for three-dimensional interaction monitoring in an ems environment|
DE102012110190B4|2012-10-25|2015-03-26|Mis-Robotics Gmbh|Manually operated robot control and method for controlling a robot system|
US10864048B2|2012-11-02|2020-12-15|Intuitive Surgical Operations, Inc.|Flux disambiguation for teleoperated surgical systems|
US10631939B2|2012-11-02|2020-04-28|Intuitive Surgical Operations, Inc.|Systems and methods for mapping flux supply paths|
WO2014093822A1|2012-12-14|2014-06-19|Abb Technology Ag|Bare hand robot path teaching|
US8989902B1|2013-03-05|2015-03-24|U.S. Department Of Energy|User interface for a tele-operated robotic hand system|
US9566414B2|2013-03-13|2017-02-14|Hansen Medical, Inc.|Integrated catheter and guide wire controller|
EP3900641A4|2013-03-14|2021-10-27|Sri int inc|Wrist and grasper system for a robotic tool|
WO2014151621A1|2013-03-15|2014-09-25|Sri International|Hyperdexterous surgical system|
US10849702B2|2013-03-15|2020-12-01|Auris Health, Inc.|User input devices for controlling manipulation of guidewires and catheters|
US9283046B2|2013-03-15|2016-03-15|Hansen Medical, Inc.|User interface for active drive apparatus with finite range of motion|
US11020016B2|2013-05-30|2021-06-01|Auris Health, Inc.|System and method for displaying anatomy and devices on a movable display|
EP2932933B1|2013-07-22|2017-11-01|Olympus Corporation|Medical portable terminal device|
DE102013108228A1|2013-07-31|2015-02-05|MAQUET GmbH|Assistance device for the imaging support of an operator during a surgical procedure|
KR102237597B1|2014-02-18|2021-04-07|삼성전자주식회사|Master device for surgical robot and control method thereof|
EP3243476B1|2014-03-24|2019-11-06|Auris Health, Inc.|Systems and devices for catheter driving instinctiveness|
DE102014006264A1|2014-04-30|2015-11-05|gomtec GmbH|Method and device for controlling the movement of an object|
EP3139843A4|2014-05-05|2018-05-30|Vicarious Surgical Inc.|Virtual reality surgical device|
KR20170041698A|2014-08-12|2017-04-17|인튜어티브 서지컬 오퍼레이션즈 인코포레이티드|Detecting uncontrolled movement|
US9815206B2|2014-09-25|2017-11-14|The Johns Hopkins University|Surgical system user interface using cooperatively-controlled robot|
US9811555B2|2014-09-27|2017-11-07|Intel Corporation|Recognition of free-form gestures from orientation tracking of a handheld or wearable device|
KR20170083091A|2014-11-13|2017-07-17|인튜어티브 서지컬 오퍼레이션즈 인코포레이티드|Integrated user environments|
CN104503576A|2014-12-22|2015-04-08|山东超越数控电子有限公司|Computer operation method based on gesture recognition|
CN104535088B|2015-01-20|2017-08-29|上海电气集团股份有限公司|A kind of finger ring type sensor device for carrier|
CN107073704A|2015-02-25|2017-08-18|奥林巴斯株式会社|Arm-and-hand system and medical system|
CN107708595B|2015-04-23|2020-08-04|Sri国际公司|Ultra-dexterous surgical system user interface device|
FR3036302B1|2015-05-20|2017-06-02|Commissariat A L`Energie Atomique Et Aux Energies Alternatives|TELEOPERATED MANUAL WELDING METHOD AND WELDING ROBOT USING SUCH A METHOD|
US10828115B2|2015-07-23|2020-11-10|Sri International|Robotic arm and robotic surgical system|
EP3342553A4|2015-08-25|2019-08-07|Kawasaki Jukogyo Kabushiki Kaisha|Information sharing system and information sharing method for sharing information between multiple robot systems|
JPWO2017038836A1|2015-08-28|2018-06-14|国立大学法人九州大学|Robot hand and master for operating it|
AU2016341284A1|2015-10-22|2018-04-12|Covidien Lp|Variable sweeping for input devices|
WO2017158627A1|2016-03-18|2017-09-21|Council Of Scientific & Industrial Research|A device for sensing the pose & motion of a human's arm-hand|
CN105832419A|2016-05-06|2016-08-10|济南创泽生物医药科技有限公司|Micro-precision accurate surgical robot|
EP3481297A1|2016-07-06|2019-05-15|Koninklijke Philips N.V.|Measuring a length of movement of an elongate intraluminal device|
US11037464B2|2016-07-21|2021-06-15|Auris Health, Inc.|System with emulator movement tracking for controlling medical devices|
US10099368B2|2016-10-25|2018-10-16|Brandon DelSpina|System for controlling light and for tracking tools in a three-dimensional space|
EP3324270A1|2016-11-16|2018-05-23|Thomson Licensing|Selection of an object in an augmented reality environment|
WO2018098444A1|2016-11-28|2018-05-31|Verb Surgical Inc.|Robotic surgical system to reduce unwanted vibration|
CN108297091A|2017-01-12|2018-07-20|上银科技股份有限公司|The method of auto-changing directional type position control|
EP3579736A4|2017-02-09|2020-12-23|Vicarious Surgical Inc.|Virtual reality surgical tools system|
CN106873787A|2017-04-10|2017-06-20|武汉大学|A kind of gesture interaction system and method for virtual teach-in teaching|
US10856948B2|2017-05-31|2020-12-08|Verb Surgical Inc.|Cart for robotic arms and method and apparatus for registering cart to surgical table|
US10485623B2|2017-06-01|2019-11-26|Verb Surgical Inc.|Robotic arm cart with fine position adjustment features and uses therefor|
CN107220507B|2017-06-06|2021-04-13|吕煜|Remote medical control device and control method|
US10913145B2|2017-06-20|2021-02-09|Verb Surgical Inc.|Cart for robotic arms and method and apparatus for cartridge or magazine loading of arms|
US20200222138A1|2017-07-06|2020-07-16|Intuitive Surgical Operations, Inc.|Systems and methods for haptic feedback in selection of menu items in a teleoperational system|
US11141160B2|2017-10-30|2021-10-12|Cilag Gmbh International|Clip applier comprising a motor controller|
US11229436B2|2017-10-30|2022-01-25|Cilag Gmbh International|Surgical system comprising a surgical tool and a surgical hub|
US11103268B2|2017-10-30|2021-08-31|Cilag Gmbh International|Surgical clip applier comprising adaptive firing control|
WO2019113391A1|2017-12-08|2019-06-13|Auris Health, Inc.|System and method for medical instrument navigation and targeting|
US10944728B2|2017-12-28|2021-03-09|Ethicon Llc|Interactive surgical systems with encrypted communication capabilities|
US20190205001A1|2017-12-28|2019-07-04|Ethicon Llc|Sterile field interactive control displays|
US10943454B2|2017-12-28|2021-03-09|Ethicon Llc|Detection and escalation of security responses of surgical instruments to increasing severity threats|
US11056244B2|2017-12-28|2021-07-06|Cilag Gmbh International|Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks|
US11045591B2|2017-12-28|2021-06-29|Cilag Gmbh International|Dual in-series large and small droplet filters|
US11076921B2|2017-12-28|2021-08-03|Cilag Gmbh International|Adaptive control program updates for surgical hubs|
US11257589B2|2017-12-28|2022-02-22|Cilag Gmbh International|Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes|
US11202570B2|2017-12-28|2021-12-21|Cilag Gmbh International|Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems|
US10849697B2|2017-12-28|2020-12-01|Ethicon Llc|Cloud interface for coupled surgical devices|
US11234756B2|2017-12-28|2022-02-01|Cilag Gmbh International|Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter|
US10892995B2|2017-12-28|2021-01-12|Ethicon Llc|Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs|
US11166772B2|2017-12-28|2021-11-09|Cilag Gmbh International|Surgical hub coordination of control and communication of operating room devices|
US11100631B2|2017-12-28|2021-08-24|Cilag Gmbh International|Use of laser light and red-green-blue coloration to determine properties of back scattered light|
US11179208B2|2017-12-28|2021-11-23|Cilag Gmbh International|Cloud-based medical analytics for security and authentication trends and reactive measures|
US20190274716A1|2017-12-28|2019-09-12|Ethicon Llc|Determining the state of an ultrasonic end effector|
US11147607B2|2017-12-28|2021-10-19|Cilag Gmbh International|Bipolar combination device that automatically adjusts pressure based on energy modality|
US11013563B2|2017-12-28|2021-05-25|Ethicon Llc|Drive arrangements for robot-assisted surgical platforms|
US20190206551A1|2017-12-28|2019-07-04|Ethicon Llc|Spatial awareness of surgical hubs in operating rooms|
US10892899B2|2017-12-28|2021-01-12|Ethicon Llc|Self describing data packets generated at an issuing instrument|
US10987178B2|2017-12-28|2021-04-27|Ethicon Llc|Surgical hub control arrangements|
US10758310B2|2017-12-28|2020-09-01|Ethicon Llc|Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices|
US10932872B2|2017-12-28|2021-03-02|Ethicon Llc|Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set|
US11069012B2|2017-12-28|2021-07-20|Cilag Gmbh International|Interactive surgical systems with condition handling of devices and data capabilities|
US11109866B2|2017-12-28|2021-09-07|Cilag Gmbh International|Method for circular stapler control algorithm adjustment based on situational awareness|
US11132462B2|2017-12-28|2021-09-28|Cilag Gmbh International|Data stripping method to interrogate patient records and create anonymized record|
US11253315B2|2017-12-28|2022-02-22|Cilag Gmbh International|Increasing radio frequency to create pad-less monopolar loop|
US10966791B2|2017-12-28|2021-04-06|Ethicon Llc|Cloud-based medical analytics for medical facility segmented individualization of instrument function|
US20190201087A1|2017-12-28|2019-07-04|Ethicon Llc|Smoke evacuation system including a segmented control circuit for interactive surgical platform|
US20190201129A1|2017-12-28|2019-07-04|Ethicon Llc|Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use|
US11160605B2|2017-12-28|2021-11-02|Cilag Gmbh International|Surgical evacuation sensing and motor control|
US11051876B2|2017-12-28|2021-07-06|Cilag Gmbh International|Surgical evacuation flow paths|
US10695081B2|2017-12-28|2020-06-30|Ethicon Llc|Controlling a surgical instrument according to sensed closure parameters|
US11213359B2|2017-12-28|2022-01-04|Cilag Gmbh International|Controllers for robot-assisted surgical platforms|
US20190201146A1|2017-12-28|2019-07-04|Ethicon Llc|Safety systems for smart powered surgical stapling|
US11096693B2|2017-12-28|2021-08-24|Cilag Gmbh International|Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing|
US11266468B2|2017-12-28|2022-03-08|Cilag Gmbh International|Cooperative utilization of data derived from secondary sources by intelligent surgical hubs|
US20200390512A1|2018-02-21|2020-12-17|Intuitive Surgical Operations, Inc.|Systems and methods for automatic grip adjustment during energy delivery|
US11259830B2|2018-03-08|2022-03-01|Cilag Gmbh International|Methods for controlling temperature in ultrasonic device|
US11219453B2|2018-03-28|2022-01-11|Cilag Gmbh International|Surgical stapling devices with cartridge compatible closure and firing lockout arrangements|
US11090047B2|2018-03-28|2021-08-17|Cilag Gmbh International|Surgical instrument comprising an adaptive control system|
US11207067B2|2018-03-28|2021-12-28|Cilag Gmbh International|Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing|
US11197668B2|2018-03-28|2021-12-14|Cilag Gmbh International|Surgical stapling assembly comprising a lockout and an exterior access orifice to permit artificial unlocking of the lockout|
US11166716B2|2018-03-28|2021-11-09|Cilag Gmbh International|Stapling instrument comprising a deactivatable lockout|
US11213294B2|2018-03-28|2022-01-04|Cilag Gmbh International|Surgical instrument comprising co-operating lockout features|
US20190298350A1|2018-03-28|2019-10-03|Ethicon Llc|Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems|
US11096688B2|2018-03-28|2021-08-24|Cilag Gmbh International|Rotary driven firing members with different anvil and channel engagement features|
US10973520B2|2018-03-28|2021-04-13|Ethicon Llc|Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature|
US11185376B2|2018-04-09|2021-11-30|Rowan University|Robot for placement of spinal instrumentation|
JP2021522894A|2018-05-18|2021-09-02|オーリス ヘルス インコーポレイテッド|Controller for robot-enabled remote control system|
US11135031B2|2018-06-15|2021-10-05|Verb Surgical Inc.|User interface device having grip linkages|
US11135030B2|2018-06-15|2021-10-05|Verb Surgical Inc.|User interface device having finger clutch|
US11259807B2|2019-02-19|2022-03-01|Cilag Gmbh International|Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device|
DE102019118012B3|2019-07-03|2020-09-17|Günther Battenberg|Method and device for controlling a robot system by means of human movement|
CN110638529B|2019-09-20|2021-04-27|和宇健康科技股份有限公司|Operation remote control method and device, storage medium and terminal equipment|
RU2718568C1|2019-11-25|2020-04-08|Ассистирующие Хирургические Технологии , Лтд|Wrist controller for use in operator's robot-surgery system controller|
RU2716353C1|2019-11-25|2020-03-11|Ассистирующие Хирургические Технологии , Лтд|Hand controller for use in robot surgery system operator's controller|
法律状态:
2018-03-27| B15K| Others concerning applications: alteration of classification|Ipc: B25J 13/02 (2006.01), A61B 34/30 (2016.01), A61B 3 |
2018-05-15| B15K| Others concerning applications: alteration of classification|Ipc: G06F 3/01 (2006.01), A61B 17/00 (2006.01), B25J 13 |
2019-01-08| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-07-30| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-05-19| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-10-13| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: PRAZO DE VALIDADE: 20 (VINTE) ANOS CONTADOS A PARTIR DE 11/11/2010, OBSERVADAS AS CONDICOES LEGAIS. |
优先权:
申请号 | 申请日 | 专利标题
US12/617,937|2009-11-13|
US12/617,937|US8521331B2|2009-11-13|2009-11-13|Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument|
US12/887,091|2010-09-21|
US12/887,091|US8682489B2|2009-11-13|2010-09-21|Method and system for hand control of a teleoperated minimally invasive slave surgical instrument|
PCT/US2010/056383|WO2011060171A1|2009-11-13|2010-11-11|Method and system for hand control of a teleoperated minimally invasive slave surgical instrument|
[返回顶部]